Mar 19 00:07:14 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 00:07:14 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 00:07:14 crc restorecon[4683]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 00:07:14 crc restorecon[4683]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 00:07:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:14 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 
crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 00:07:15 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 00:07:15 crc kubenswrapper[4745]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 00:07:15 crc kubenswrapper[4745]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 00:07:15 crc kubenswrapper[4745]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 00:07:15 crc kubenswrapper[4745]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 00:07:15 crc kubenswrapper[4745]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 00:07:15 crc kubenswrapper[4745]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.886535 4745 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892116 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892151 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892164 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892174 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892185 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892194 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892204 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892213 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892221 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892232 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892242 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892252 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892261 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892270 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892279 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892290 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892299 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892309 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892318 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892326 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892334 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892342 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892350 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892357 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892365 4745 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892380 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892388 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892396 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892404 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892412 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892421 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892429 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892436 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892444 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892453 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892463 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892471 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892479 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892487 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892496 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892506 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892515 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892524 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892532 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892539 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892548 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892556 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892564 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892572 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892583 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892592 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892601 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892610 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892619 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892628 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892636 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892646 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892655 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892662 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892670 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892677 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892685 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892693 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892700 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892708 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892715 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892723 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892731 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892738 4745 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892746 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.892753 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.892942 4745 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.892960 4745 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.892976 4745 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.892987 4745 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.892999 4745 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893009 4745 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893021 4745 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893032 4745 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893041 4745 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893050 4745 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893060 4745 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893070 4745 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893079 4745 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893088 4745 flags.go:64] FLAG: --cgroup-root=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893096 4745 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893106 4745 flags.go:64] FLAG: --client-ca-file=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893115 4745 flags.go:64] FLAG: --cloud-config=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893123 4745 flags.go:64] FLAG: --cloud-provider=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893132 4745 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893144 4745 flags.go:64] FLAG: --cluster-domain=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893153 4745 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893164 4745 flags.go:64] FLAG: --config-dir=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893173 4745 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893185 4745 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893198 4745 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893207 4745 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893217 4745 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893226 4745 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893236 4745 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893244 4745 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893254 4745 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893265 4745 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893274 4745 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893286 4745 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893294 4745 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893303 4745 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893313 4745 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893323 4745 flags.go:64] FLAG: --enable-server="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893332 4745 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893346 4745 flags.go:64] FLAG: --event-burst="100"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893356 4745 flags.go:64] FLAG: --event-qps="50"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893365 4745 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893375 4745 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893385 4745 flags.go:64] FLAG: --eviction-hard=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893401 4745 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893410 4745 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893419 4745 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893429 4745 flags.go:64] FLAG: --eviction-soft=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893438 4745 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893447 4745 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893456 4745 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893465 4745 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893474 4745 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893483 4745 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893492 4745 flags.go:64] FLAG: --feature-gates=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893504 4745 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893513 4745 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893523 4745 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893532 4745 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893541 4745 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893550 4745 flags.go:64] FLAG: --help="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893559 4745 flags.go:64] FLAG: --hostname-override=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893569 4745 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893578 4745 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893587 4745 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893597 4745 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893607 4745 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893617 4745 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893627 4745 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893636 4745 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893645 4745 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893654 4745 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893663 4745 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893676 4745 flags.go:64] FLAG: --kube-reserved=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893685 4745 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893693 4745 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893703 4745 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893711 4745 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893720 4745 flags.go:64] FLAG: --lock-file=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893729 4745 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893738 4745 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893746 4745 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893759 4745 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893769 4745 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893777 4745 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893786 4745 flags.go:64] FLAG: --logging-format="text"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893795 4745 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893806 4745 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893815 4745 flags.go:64] FLAG: --manifest-url=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893824 4745 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893836 4745 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893845 4745 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893857 4745 flags.go:64] FLAG: --max-pods="110"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893866 4745 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893875 4745 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893907 4745 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893916 4745 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893926 4745 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893936 4745 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893947 4745 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893967 4745 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893976 4745 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893985 4745 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.893994 4745 flags.go:64] FLAG: --pod-cidr=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894005 4745 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894020 4745 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894029 4745 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894038 4745 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894047 4745 flags.go:64] FLAG: --port="10250"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894072 4745 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894081 4745 flags.go:64] FLAG: --provider-id=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894090 4745 flags.go:64] FLAG: --qos-reserved=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894099 4745 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894108 4745 flags.go:64] FLAG: --register-node="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894117 4745 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894126 4745 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894141 4745 flags.go:64] FLAG: --registry-burst="10"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894149 4745 flags.go:64] FLAG: --registry-qps="5"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894158 4745 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894168 4745 flags.go:64] FLAG: --reserved-memory=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894178 4745 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894188 4745 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894197 4745 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894206 4745 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894216 4745 flags.go:64] FLAG: --runonce="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894225 4745 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894236 4745 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894245 4745 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894253 4745 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894262 4745 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894272 4745 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894281 4745 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894290 4745 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894299 4745 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894308 4745 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894317 4745 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894325 4745 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894335 4745 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894344 4745 flags.go:64] FLAG: --system-cgroups=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894353 4745 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894367 4745 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894375 4745 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894384 4745 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894395 4745 flags.go:64] FLAG: --tls-min-version=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894404 4745 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894425 4745 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894434 4745 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894443 4745 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894453 4745 flags.go:64] FLAG: --v="2"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894468 4745 flags.go:64] FLAG: --version="false"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894483 4745 flags.go:64] FLAG: --vmodule=""
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894499 4745 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.894512 4745 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894794 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894807 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894817 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894827 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894836 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894846 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894855 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894864 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894873 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894906 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894915 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894923 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894931 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894939 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894947 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894954 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894962 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894973 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894983 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894991 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.894999 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895007 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895015 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895025 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895036 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895045 4745 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895052 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895060 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895081 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895089 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895097 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895106 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895114 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895121 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895129 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895137 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895145 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895152 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895161 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895171 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895179 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895187 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895194 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895202 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895209 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895217 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895225 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895232 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895242 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895250 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895258 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895266 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895274 4745 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895281 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895289 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895297 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895305 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895315 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895326 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895335 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895345 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895353 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895362 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895370 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895391 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895411 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895420 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895428 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895436 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895444 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.895452 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.895477 4745 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.911409 4745 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.911461 4745 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911601 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911622 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911631 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911642 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911651 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911659 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911667 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911675 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911683 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911691 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911701 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911714 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911723 4745 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911732 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911741 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911749 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911757 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911765 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911774 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911781 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911789 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911800 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911810 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911820 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911830 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911838 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911846 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911854 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911864 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911875 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911908 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911916 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911924 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911932 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911943 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911951 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 00:07:15 crc 
kubenswrapper[4745]: W0319 00:07:15.911958 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911966 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911973 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911981 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911989 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.911997 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912005 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912012 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912020 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912030 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912040 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912049 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912058 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912068 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912076 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912084 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912092 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912100 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912107 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912115 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912123 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912131 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912139 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912146 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912155 4745 feature_gate.go:330] unrecognized feature 
gate: EtcdBackendQuota Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912164 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912171 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912180 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912188 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912196 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912204 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912211 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912219 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912227 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912235 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.912250 4745 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 
00:07:15.912469 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912484 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912493 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912501 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912509 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912520 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912531 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912540 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912549 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912558 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912566 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912577 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912587 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912595 4745 feature_gate.go:330] unrecognized feature gate: Example Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912604 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912612 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912620 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912629 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912637 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912645 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912653 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912661 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912668 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912676 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912685 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912695 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912703 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912711 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912719 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912727 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912735 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912742 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912750 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912757 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912766 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912775 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912782 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912790 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912798 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912805 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912813 4745 
feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912821 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912829 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912838 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912845 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912856 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912866 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912875 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912904 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912912 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912921 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912930 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912939 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912949 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912957 4745 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912965 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912973 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912981 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912989 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.912997 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913004 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913013 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913020 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913028 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913035 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913043 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913051 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913059 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913067 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 00:07:15 
crc kubenswrapper[4745]: W0319 00:07:15.913074 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 00:07:15 crc kubenswrapper[4745]: W0319 00:07:15.913083 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.913097 4745 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.914092 4745 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 00:07:15 crc kubenswrapper[4745]: E0319 00:07:15.919260 4745 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.922951 4745 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.923062 4745 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.926130 4745 server.go:997] "Starting client certificate rotation" Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.926157 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.926374 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.952320 4745 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.957741 4745 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 00:07:15 crc kubenswrapper[4745]: E0319 00:07:15.959064 4745 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:15 crc kubenswrapper[4745]: I0319 00:07:15.972966 4745 log.go:25] "Validated CRI v1 runtime API" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.018795 4745 log.go:25] "Validated CRI v1 image API" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.021023 4745 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.027447 4745 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-00-00-55-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.027504 4745 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.051042 4745 manager.go:217] Machine: {Timestamp:2026-03-19 00:07:16.045909777 +0000 UTC m=+0.584104948 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a46e9744-0e29-4d5f-ba27-ee05bebca43c BootID:f552003b-21de-4401-adf9-0568d3518be8 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:4c:81:5d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4c:81:5d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:29:48:13 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:93:8e:ce Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:81:d8:9a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f4:b1:70 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8e:d1:5d:63:de:c2 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:52:69:58:eb:c8:9c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.051502 4745 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.051821 4745 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.053709 4745 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.053982 4745 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.054029 4745 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.054275 4745 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.054288 4745 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.055091 4745 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.055135 4745 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.055399 4745 state_mem.go:36] "Initialized new in-memory state store" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.055508 4745 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.059700 4745 kubelet.go:418] "Attempting to sync node with API server" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.059732 4745 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.059796 4745 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.059822 4745 kubelet.go:324] "Adding apiserver pod source" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.059839 4745 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 
00:07:16.063980 4745 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.067044 4745 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.067768 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.067898 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.067971 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.067996 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.070097 4745 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 00:07:16 
crc kubenswrapper[4745]: I0319 00:07:16.076828 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076874 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076897 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076906 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076919 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076928 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076937 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076949 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076960 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076969 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076981 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.076988 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.077021 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.077569 4745 server.go:1280] "Started kubelet" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 
00:07:16.078435 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:16 crc systemd[1]: Started Kubernetes Kubelet. Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.080151 4745 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.080192 4745 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.081067 4745 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.082290 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.082514 4745 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.082681 4745 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.082704 4745 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.083131 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.083467 4745 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.082694 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e15672afc1300 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.077531904 +0000 UTC m=+0.615727035,LastTimestamp:2026-03-19 00:07:16.077531904 +0000 UTC m=+0.615727035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.084073 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.084954 4745 factory.go:55] Registering systemd factory Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.084988 4745 factory.go:221] Registration of the systemd container factory successfully Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.085401 4745 factory.go:153] Registering CRI-O factory Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.085440 4745 factory.go:221] Registration of the crio container factory successfully Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.085555 4745 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.085590 4745 factory.go:103] Registering Raw factory Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.085619 4745 manager.go:1196] Started watching for new ooms in manager Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.085933 4745 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.086169 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.086610 4745 manager.go:319] Starting recovery of all containers Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.086688 4745 server.go:460] "Adding debug handlers to kubelet server" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097118 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097219 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097244 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097263 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097288 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097318 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097347 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.097371 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098520 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098544 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098569 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098589 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098615 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098638 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098658 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098674 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098697 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098713 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098732 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098757 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098776 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098798 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098817 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098839 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098862 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098902 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098933 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098958 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" 
seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.098982 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099000 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099029 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099051 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099081 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099103 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099490 4745 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099516 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099542 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099575 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099595 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099616 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099643 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099667 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099685 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099710 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099731 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099760 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099779 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099798 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099825 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099847 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099872 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099908 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099943 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.099969 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.100946 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.100987 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101009 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101027 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101042 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101060 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101079 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101097 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101115 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101133 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101151 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101168 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101185 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101203 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101224 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101250 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101274 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.101294 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103146 4745 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103189 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103208 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103225 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103247 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103266 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103285 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103324 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103365 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103381 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103399 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103417 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103433 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103450 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103467 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103484 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103501 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103536 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103560 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103598 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103618 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103635 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103651 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103668 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103686 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103702 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103720 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103736 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103753 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" 
seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103772 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103789 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103806 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103822 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103848 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103866 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 
00:07:16.103910 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103953 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103972 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.103989 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104007 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104025 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104046 4745 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104094 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104114 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104133 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104151 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104165 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104181 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104198 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104214 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104231 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104247 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104265 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104304 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104326 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104349 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104371 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104393 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104412 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104433 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" 
seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104457 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104479 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104500 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104522 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104542 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104562 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104585 
4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104631 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104652 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104673 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104693 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104715 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104736 4745 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104756 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104777 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104798 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104818 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104840 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104861 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104908 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104932 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104954 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104977 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.104999 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105025 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105046 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105069 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105089 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105108 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105132 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105155 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105177 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105198 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105219 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105239 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105261 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105282 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" 
Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105303 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105323 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105343 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105365 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105386 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105407 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105427 4745 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105449 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105471 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105493 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105518 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105540 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105561 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105580 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105600 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105630 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105650 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105670 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105690 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105715 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105736 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105758 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105781 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105801 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105820 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" 
seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105838 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105861 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105908 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105929 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105949 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.105969 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 
00:07:16.105989 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106009 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106030 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106052 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106074 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106093 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106126 4745 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106146 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106162 4745 reconstruct.go:97] "Volume reconstruction finished" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.106174 4745 reconciler.go:26] "Reconciler: start to sync state" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.111645 4745 manager.go:324] Recovery completed Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.127567 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.129579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.129638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.129662 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.130382 4745 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.130397 4745 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.130422 4745 state_mem.go:36] "Initialized new in-memory state store" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.134544 4745 
kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.136429 4745 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.136467 4745 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.136499 4745 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.136548 4745 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.137312 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.137388 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.154057 4745 policy_none.go:49] "None policy: Start" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.155055 4745 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.155091 4745 state_mem.go:35] "Initializing new in-memory state store" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.183795 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:07:16 crc 
kubenswrapper[4745]: I0319 00:07:16.212329 4745 manager.go:334] "Starting Device Plugin manager" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.212402 4745 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.212418 4745 server.go:79] "Starting device plugin registration server" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.212918 4745 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.212941 4745 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.213679 4745 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.213790 4745 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.213803 4745 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.220427 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.236930 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.237105 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.238693 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.238735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.238747 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.238938 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.239118 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.239159 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240126 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240135 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240209 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc 
kubenswrapper[4745]: I0319 00:07:16.240278 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240334 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240381 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240822 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.240928 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241164 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241231 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241482 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241491 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241576 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241828 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.241931 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243239 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243287 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243295 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243326 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243339 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243302 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243596 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243623 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.243957 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.244007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.244018 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.245652 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.245678 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.245696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.284909 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308270 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 
crc kubenswrapper[4745]: I0319 00:07:16.308454 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308551 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308657 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308808 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308850 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308871 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308927 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308946 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308965 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.308982 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.309003 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.309020 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.309058 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.313292 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.314568 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.314672 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.314731 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.314803 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.315938 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.410689 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.410832 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.410863 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.410963 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411041 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411102 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411135 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411198 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411237 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411306 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411358 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411383 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411406 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411460 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411547 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411684 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411802 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411840 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411869 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411913 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411919 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411869 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411959 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411960 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411867 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411732 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.411982 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.412071 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.412063 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.516444 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.517838 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.517872 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.517901 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.517925 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.518403 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.563032 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.570674 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.588171 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.605202 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6200288d9e66bb5d17a0dd1297e8bfecbd1aa230ad5fffbba294ca32d31ab886 WatchSource:0}: Error finding container 6200288d9e66bb5d17a0dd1297e8bfecbd1aa230ad5fffbba294ca32d31ab886: Status 404 returned error can't find the container with id 6200288d9e66bb5d17a0dd1297e8bfecbd1aa230ad5fffbba294ca32d31ab886 Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.605653 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-aa7e6c8d5d72a0ed060f8788ef5ec25862cba99f7e79fb35101ddf29c9d40c07 WatchSource:0}: Error finding container 
aa7e6c8d5d72a0ed060f8788ef5ec25862cba99f7e79fb35101ddf29c9d40c07: Status 404 returned error can't find the container with id aa7e6c8d5d72a0ed060f8788ef5ec25862cba99f7e79fb35101ddf29c9d40c07 Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.607474 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.609666 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c8d324d5c03ed3f713c73cb41cdd96b0cc52d7d5a31a71b7cb5a9ffb6ea7c737 WatchSource:0}: Error finding container c8d324d5c03ed3f713c73cb41cdd96b0cc52d7d5a31a71b7cb5a9ffb6ea7c737: Status 404 returned error can't find the container with id c8d324d5c03ed3f713c73cb41cdd96b0cc52d7d5a31a71b7cb5a9ffb6ea7c737 Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.611646 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.619779 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8818671e6f6a328c749bbc892a5985e6c602a651f15cf658ae347901f52cf6e5 WatchSource:0}: Error finding container 8818671e6f6a328c749bbc892a5985e6c602a651f15cf658ae347901f52cf6e5: Status 404 returned error can't find the container with id 8818671e6f6a328c749bbc892a5985e6c602a651f15cf658ae347901f52cf6e5 Mar 19 00:07:16 crc kubenswrapper[4745]: W0319 00:07:16.631649 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4f783fa99c28a384cf49c43259891d42b167b8816b560ce856eb2c39fd927bbf WatchSource:0}: Error finding container 4f783fa99c28a384cf49c43259891d42b167b8816b560ce856eb2c39fd927bbf: Status 404 returned error can't find the container with id 4f783fa99c28a384cf49c43259891d42b167b8816b560ce856eb2c39fd927bbf Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.685851 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.918646 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.922277 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.922323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.922334 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:16 crc kubenswrapper[4745]: I0319 00:07:16.922362 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:16 crc kubenswrapper[4745]: E0319 00:07:16.922785 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 19 00:07:17 crc kubenswrapper[4745]: W0319 00:07:17.061485 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:17 crc kubenswrapper[4745]: E0319 00:07:17.061594 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.079209 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.144152 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa7e6c8d5d72a0ed060f8788ef5ec25862cba99f7e79fb35101ddf29c9d40c07"} Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.147096 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6200288d9e66bb5d17a0dd1297e8bfecbd1aa230ad5fffbba294ca32d31ab886"} Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.149717 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4f783fa99c28a384cf49c43259891d42b167b8816b560ce856eb2c39fd927bbf"} Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.150845 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8818671e6f6a328c749bbc892a5985e6c602a651f15cf658ae347901f52cf6e5"} Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.151834 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c8d324d5c03ed3f713c73cb41cdd96b0cc52d7d5a31a71b7cb5a9ffb6ea7c737"} Mar 19 00:07:17 crc kubenswrapper[4745]: W0319 00:07:17.216565 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:17 crc kubenswrapper[4745]: E0319 00:07:17.216654 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 
00:07:17 crc kubenswrapper[4745]: W0319 00:07:17.379109 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:17 crc kubenswrapper[4745]: E0319 00:07:17.379204 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:17 crc kubenswrapper[4745]: E0319 00:07:17.487607 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 19 00:07:17 crc kubenswrapper[4745]: W0319 00:07:17.551222 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:17 crc kubenswrapper[4745]: E0319 00:07:17.551325 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.723251 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.725117 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.725163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.725173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:17 crc kubenswrapper[4745]: I0319 00:07:17.725201 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:17 crc kubenswrapper[4745]: E0319 00:07:17.725832 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.006285 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 00:07:18 crc kubenswrapper[4745]: E0319 00:07:18.007732 4745 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.080224 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.159054 4745 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d" exitCode=0 Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.159157 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.159244 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.160446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.160483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.160496 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.161591 4745 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547" exitCode=0 Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.161723 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.161865 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.162392 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:18 crc 
kubenswrapper[4745]: I0319 00:07:18.162412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.162420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.167499 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.167565 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.167583 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.167596 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.167751 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.169725 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1" exitCode=0 Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.169823 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.169869 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.169976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.170018 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.170035 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.171152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.171191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.171208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.172638 4745 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="99828ce30b7a765df950ffb0a316825bebe1038153a10bb33c4de32049c7a208" exitCode=0 Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.172696 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"99828ce30b7a765df950ffb0a316825bebe1038153a10bb33c4de32049c7a208"} Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.172807 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.173872 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.173914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.173925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.176710 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.186245 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.186310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:18 crc kubenswrapper[4745]: I0319 00:07:18.186340 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.080182 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:19 crc kubenswrapper[4745]: E0319 00:07:19.088844 4745 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 19 00:07:19 crc kubenswrapper[4745]: W0319 00:07:19.120650 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:19 crc kubenswrapper[4745]: E0319 00:07:19.120819 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.183128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.183189 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.183202 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.183216 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.185960 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9bce0685695f5cbfdfc06fd9efbb8ec07c6229ddc45a90eb0c85d6af7dabf530"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.186043 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.191012 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.191053 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.191066 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.192652 4745 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0" exitCode=0 Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.192723 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.192902 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 
00:07:19.194284 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.194320 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.194330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.196494 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.196532 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.196556 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008"} Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.196625 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.196631 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.198321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 
00:07:19.198353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.198366 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.198443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.198466 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.198478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.326591 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.328056 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.328105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.328121 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:19 crc kubenswrapper[4745]: I0319 00:07:19.328155 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:19 crc kubenswrapper[4745]: E0319 00:07:19.328743 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 19 00:07:19 crc kubenswrapper[4745]: W0319 00:07:19.441638 4745 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:19 crc kubenswrapper[4745]: E0319 00:07:19.441737 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:19 crc kubenswrapper[4745]: W0319 00:07:19.591086 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 19 00:07:19 crc kubenswrapper[4745]: E0319 00:07:19.591204 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.202947 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29301588f7ab9acc40a0c8dfda80e8bc9540a0dbaed93d5aecec7b5c45725a25"} Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.203078 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.204199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.204270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.204295 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.205111 4745 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102" exitCode=0 Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.205201 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.205229 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102"} Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.205234 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.205302 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.205333 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206318 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206329 4745 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206337 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:20 crc kubenswrapper[4745]: I0319 00:07:20.206569 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.212061 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11"} Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.212111 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.212167 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.212115 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6"} Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.212211 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4"} Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.212236 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba"} Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.213588 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.213649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.213662 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.533532 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.533764 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.535093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:21 crc kubenswrapper[4745]: I0319 00:07:21.535183 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:21 crc 
kubenswrapper[4745]: I0319 00:07:21.535203 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.219141 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a"} Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.219413 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.220655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.220698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.220716 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.285024 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.526148 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.526424 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.527966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.528014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.528031 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.529072 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.529932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.529957 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.529966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.529983 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.656698 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.656941 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.657043 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.658437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.658480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:22 crc kubenswrapper[4745]: I0319 00:07:22.658493 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.221768 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.223372 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.223448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.223464 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.458083 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.458285 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.458344 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.459959 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.460000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.460011 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:23 crc kubenswrapper[4745]: I0319 00:07:23.507744 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 19 00:07:24 crc kubenswrapper[4745]: I0319 00:07:24.224841 4745 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 19 00:07:24 crc kubenswrapper[4745]: I0319 00:07:24.226416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:24 crc kubenswrapper[4745]: I0319 00:07:24.226481 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:24 crc kubenswrapper[4745]: I0319 00:07:24.226505 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:25 crc kubenswrapper[4745]: I0319 00:07:25.107587 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:25 crc kubenswrapper[4745]: I0319 00:07:25.107968 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:25 crc kubenswrapper[4745]: I0319 00:07:25.109410 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:25 crc kubenswrapper[4745]: I0319 00:07:25.109459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:25 crc kubenswrapper[4745]: I0319 00:07:25.109476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:26 crc kubenswrapper[4745]: E0319 00:07:26.220652 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.273487 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.273826 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.275550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.275603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.275620 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.546344 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.546662 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.548443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.548491 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:26 crc kubenswrapper[4745]: I0319 00:07:26.548512 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:27 crc kubenswrapper[4745]: I0319 00:07:27.981245 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:27 crc kubenswrapper[4745]: I0319 00:07:27.981517 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:27 crc kubenswrapper[4745]: I0319 00:07:27.983161 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:27 crc kubenswrapper[4745]: I0319 00:07:27.983239 
4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:27 crc kubenswrapper[4745]: I0319 00:07:27.983268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:27 crc kubenswrapper[4745]: I0319 00:07:27.989785 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:28 crc kubenswrapper[4745]: I0319 00:07:28.235940 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:28 crc kubenswrapper[4745]: I0319 00:07:28.237791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:28 crc kubenswrapper[4745]: I0319 00:07:28.237858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:28 crc kubenswrapper[4745]: I0319 00:07:28.237913 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:28 crc kubenswrapper[4745]: I0319 00:07:28.243621 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.238282 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.239121 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.239168 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.239179 4745 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.273703 4745 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.273789 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.825815 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z Mar 19 00:07:29 crc kubenswrapper[4745]: W0319 00:07:29.827373 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.827464 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 00:07:29 crc kubenswrapper[4745]: W0319 00:07:29.830491 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.830572 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.831274 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e15672afc1300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.077531904 +0000 UTC m=+0.615727035,LastTimestamp:2026-03-19 00:07:16.077531904 +0000 UTC m=+0.615727035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:29 crc kubenswrapper[4745]: W0319 00:07:29.832831 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.832893 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.834291 4745 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.835680 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.837056 
4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.837126 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 00:07:29 crc kubenswrapper[4745]: W0319 00:07:29.838804 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.838934 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 00:07:29 crc kubenswrapper[4745]: E0319 00:07:29.840484 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T00:07:29Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.843233 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.843308 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.925538 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.925725 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.926935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.926985 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:29 crc kubenswrapper[4745]: I0319 00:07:29.927010 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.082129 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:07:30Z is after 2026-02-23T05:33:13Z Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.243019 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.245223 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29301588f7ab9acc40a0c8dfda80e8bc9540a0dbaed93d5aecec7b5c45725a25" exitCode=255 Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.245308 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"29301588f7ab9acc40a0c8dfda80e8bc9540a0dbaed93d5aecec7b5c45725a25"} Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.245528 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.246704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.246748 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.246764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:30 crc kubenswrapper[4745]: I0319 00:07:30.247412 4745 scope.go:117] "RemoveContainer" containerID="29301588f7ab9acc40a0c8dfda80e8bc9540a0dbaed93d5aecec7b5c45725a25" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.084350 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:31Z is after 2026-02-23T05:33:13Z Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.252352 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.253217 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.256291 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" exitCode=255 Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.256360 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8"} Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.256483 4745 scope.go:117] "RemoveContainer" containerID="29301588f7ab9acc40a0c8dfda80e8bc9540a0dbaed93d5aecec7b5c45725a25" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.256770 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.258663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.258720 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:31 crc 
kubenswrapper[4745]: I0319 00:07:31.258739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:31 crc kubenswrapper[4745]: I0319 00:07:31.259681 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:31 crc kubenswrapper[4745]: E0319 00:07:31.260172 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.082102 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:32Z is after 2026-02-23T05:33:13Z Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.263502 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.663574 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.663812 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.665387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:32 crc 
kubenswrapper[4745]: I0319 00:07:32.665432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.665444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.666078 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:32 crc kubenswrapper[4745]: E0319 00:07:32.666263 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:32 crc kubenswrapper[4745]: I0319 00:07:32.670905 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:33 crc kubenswrapper[4745]: I0319 00:07:33.084356 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:33Z is after 2026-02-23T05:33:13Z Mar 19 00:07:33 crc kubenswrapper[4745]: I0319 00:07:33.273464 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:33 crc kubenswrapper[4745]: I0319 00:07:33.274942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:33 crc kubenswrapper[4745]: I0319 00:07:33.274997 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:33 crc kubenswrapper[4745]: I0319 00:07:33.275008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:33 crc kubenswrapper[4745]: I0319 00:07:33.275780 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:33 crc kubenswrapper[4745]: E0319 00:07:33.275999 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:34 crc kubenswrapper[4745]: I0319 00:07:34.084254 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:34Z is after 2026-02-23T05:33:13Z Mar 19 00:07:34 crc kubenswrapper[4745]: W0319 00:07:34.918409 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:34Z is after 2026-02-23T05:33:13Z Mar 19 00:07:34 crc kubenswrapper[4745]: E0319 00:07:34.918499 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.082111 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:07:35Z is after 2026-02-23T05:33:13Z Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.107802 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.108021 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.109370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.109410 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.109424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:35 crc kubenswrapper[4745]: I0319 00:07:35.110140 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:35 crc kubenswrapper[4745]: E0319 00:07:35.110344 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.087950 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:36 crc kubenswrapper[4745]: E0319 00:07:36.220962 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.236007 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.237350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.237415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.237441 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.237487 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:36 crc kubenswrapper[4745]: E0319 00:07:36.244705 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:07:36 crc kubenswrapper[4745]: E0319 00:07:36.245700 4745 kubelet_node_status.go:99] "Unable to register node with API 
server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.788393 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.788664 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.790383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.790488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.790548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:36 crc kubenswrapper[4745]: I0319 00:07:36.791588 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:36 crc kubenswrapper[4745]: E0319 00:07:36.791975 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:37 crc kubenswrapper[4745]: I0319 00:07:37.088138 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:38 crc 
kubenswrapper[4745]: I0319 00:07:38.086679 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:38 crc kubenswrapper[4745]: I0319 00:07:38.175703 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 00:07:38 crc kubenswrapper[4745]: I0319 00:07:38.194426 4745 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.087418 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.274760 4745 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.274925 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.840563 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672afc1300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.077531904 +0000 UTC m=+0.615727035,LastTimestamp:2026-03-19 00:07:16.077531904 +0000 UTC m=+0.615727035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.848843 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.857096 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.863931 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.871500 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15673348aa82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.216769154 +0000 UTC m=+0.754964285,LastTimestamp:2026-03-19 00:07:16.216769154 +0000 UTC m=+0.754964285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.874407 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.238727187 +0000 UTC m=+0.776922318,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.880862 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 
00:07:16.238743508 +0000 UTC m=+0.776938639,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.887775 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e17ad2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.238754558 +0000 UTC m=+0.776949689,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.894055 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.240116447 +0000 UTC m=+0.778311578,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.900626 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.240131987 +0000 UTC m=+0.778327118,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.907435 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e17ad2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.240139927 +0000 UTC m=+0.778335058,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.915006 4745 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.240255298 +0000 UTC m=+0.778450419,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.921407 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.240271838 +0000 UTC m=+0.778466969,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.927825 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e17ad2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.240284148 +0000 UTC m=+0.778479269,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.934433 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.240806811 +0000 UTC m=+0.779001942,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.942393 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.240819351 +0000 UTC m=+0.779014482,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.949844 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e17ad2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.240827471 +0000 UTC m=+0.779022602,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.957277 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.241472775 +0000 UTC m=+0.779667906,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.964718 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.964586 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.241488555 +0000 UTC m=+0.779683686,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.964994 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.969491 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:39 crc kubenswrapper[4745]: I0319 00:07:39.969564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:39 crc kubenswrapper[4745]: 
I0319 00:07:39.969596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.974436 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e17ad2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.241496116 +0000 UTC m=+0.779691247,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.978969 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.243273577 +0000 UTC m=+0.781468698,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc 
kubenswrapper[4745]: I0319 00:07:39.982176 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.985728 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.243293497 +0000 UTC m=+0.781488628,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:39 crc kubenswrapper[4745]: E0319 00:07:39.992911 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e16fae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e16fae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129626854 +0000 UTC m=+0.667821985,LastTimestamp:2026-03-19 00:07:16.243314977 +0000 UTC m=+0.781510108,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 
crc kubenswrapper[4745]: E0319 00:07:40.000136 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e176954\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e176954 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129655124 +0000 UTC m=+0.667850245,LastTimestamp:2026-03-19 00:07:16.243333507 +0000 UTC m=+0.781528638,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.007720 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e15672e17ad2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e15672e17ad2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.129672494 +0000 UTC m=+0.667867625,LastTimestamp:2026-03-19 00:07:16.243346167 +0000 UTC m=+0.781541298,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.016820 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e15674ab4662e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.609705518 +0000 UTC m=+1.147900649,LastTimestamp:2026-03-19 00:07:16.609705518 +0000 UTC m=+1.147900649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.023112 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15674ad68c12 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.611943442 +0000 UTC m=+1.150138573,LastTimestamp:2026-03-19 00:07:16.611943442 +0000 UTC m=+1.150138573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.028814 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e15674afb30f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.614344945 +0000 UTC m=+1.152540076,LastTimestamp:2026-03-19 00:07:16.614344945 +0000 UTC m=+1.152540076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.034607 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e15674b7847c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.622542792 +0000 UTC m=+1.160737923,LastTimestamp:2026-03-19 00:07:16.622542792 +0000 UTC m=+1.160737923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.040338 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15674c38f286 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:16.635169414 +0000 UTC m=+1.173364555,LastTimestamp:2026-03-19 00:07:16.635169414 +0000 UTC m=+1.173364555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.045943 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15676be9dc0d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.166857229 +0000 UTC m=+1.705052360,LastTimestamp:2026-03-19 00:07:17.166857229 +0000 UTC m=+1.705052360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.051445 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e15676c05dffe openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.168693246 +0000 UTC m=+1.706888377,LastTimestamp:2026-03-19 00:07:17.168693246 +0000 UTC m=+1.706888377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.058676 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e15676c0b3b7e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.16904435 +0000 UTC m=+1.707239481,LastTimestamp:2026-03-19 00:07:17.16904435 +0000 UTC m=+1.707239481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.063811 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15676c11b3c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.169468354 +0000 UTC m=+1.707663475,LastTimestamp:2026-03-19 00:07:17.169468354 +0000 UTC m=+1.707663475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.070660 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e15676c200003 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.170405379 +0000 UTC m=+1.708600510,LastTimestamp:2026-03-19 00:07:17.170405379 +0000 UTC m=+1.708600510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.077720 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e15676cc36523 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.181113635 +0000 UTC m=+1.719308766,LastTimestamp:2026-03-19 00:07:17.181113635 +0000 UTC m=+1.719308766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: I0319 00:07:40.081982 4745 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.081859 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15676cd9df9d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.182586781 +0000 UTC m=+1.720781912,LastTimestamp:2026-03-19 00:07:17.182586781 +0000 UTC m=+1.720781912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.085615 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15676cdb5992 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.182683538 +0000 UTC 
m=+1.720878669,LastTimestamp:2026-03-19 00:07:17.182683538 +0000 UTC m=+1.720878669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.092136 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15676cef6c8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.183999114 +0000 UTC m=+1.722194245,LastTimestamp:2026-03-19 00:07:17.183999114 +0000 UTC m=+1.722194245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: W0319 00:07:40.092266 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.092318 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is 
forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.099691 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e15676cf4b130 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.184344368 +0000 UTC m=+1.722539499,LastTimestamp:2026-03-19 00:07:17.184344368 +0000 UTC m=+1.722539499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.106553 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e15676cf4fcec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.184363756 +0000 UTC 
m=+1.722558887,LastTimestamp:2026-03-19 00:07:17.184363756 +0000 UTC m=+1.722558887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.113810 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677ed785c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.484422595 +0000 UTC m=+2.022617726,LastTimestamp:2026-03-19 00:07:17.484422595 +0000 UTC m=+2.022617726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.119751 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677f9d6f78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.497393016 +0000 UTC m=+2.035588157,LastTimestamp:2026-03-19 00:07:17.497393016 +0000 UTC m=+2.035588157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.127642 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677faf6299 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.498569369 +0000 UTC m=+2.036764500,LastTimestamp:2026-03-19 00:07:17.498569369 +0000 UTC m=+2.036764500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.133933 4745 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15678bc7a986 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.701486982 +0000 UTC m=+2.239682103,LastTimestamp:2026-03-19 00:07:17.701486982 +0000 UTC m=+2.239682103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.140104 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15678c52e265 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.710611045 +0000 UTC m=+2.248806176,LastTimestamp:2026-03-19 00:07:17.710611045 +0000 UTC m=+2.248806176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.145363 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15678c73a726 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.712758566 +0000 UTC m=+2.250953707,LastTimestamp:2026-03-19 00:07:17.712758566 +0000 UTC m=+2.250953707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.152851 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e1567992dd529 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.926286633 +0000 UTC m=+2.464481764,LastTimestamp:2026-03-19 00:07:17.926286633 +0000 UTC m=+2.464481764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.157373 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15679a5aeb3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.946018622 +0000 UTC m=+2.484213753,LastTimestamp:2026-03-19 00:07:17.946018622 +0000 UTC m=+2.484213753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.162504 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1567a73f23f0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.162301936 +0000 UTC m=+2.700497087,LastTimestamp:2026-03-19 00:07:18.162301936 +0000 UTC m=+2.700497087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.169810 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567a778e654 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.166087252 +0000 UTC m=+2.704282383,LastTimestamp:2026-03-19 00:07:18.166087252 +0000 UTC m=+2.704282383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: W0319 00:07:40.170486 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.170566 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.178362 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e1567a814d64e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.176306766 +0000 UTC m=+2.714501897,LastTimestamp:2026-03-19 00:07:18.176306766 +0000 UTC m=+2.714501897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.183812 4745 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567a8159c8d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.176357517 +0000 UTC m=+2.714552688,LastTimestamp:2026-03-19 00:07:18.176357517 +0000 UTC m=+2.714552688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.185839 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1567b6685713 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.416660243 +0000 UTC m=+2.954855364,LastTimestamp:2026-03-19 00:07:18.416660243 +0000 UTC m=+2.954855364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.193261 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567b683aa00 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.418450944 +0000 UTC m=+2.956646095,LastTimestamp:2026-03-19 00:07:18.418450944 +0000 UTC m=+2.956646095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.200146 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e1567b95f90fe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.466416894 +0000 UTC m=+3.004612025,LastTimestamp:2026-03-19 
00:07:18.466416894 +0000 UTC m=+3.004612025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.204696 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567b9c1d089 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.472855689 +0000 UTC m=+3.011050820,LastTimestamp:2026-03-19 00:07:18.472855689 +0000 UTC m=+3.011050820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.212549 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567b9c5972a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.473103146 
+0000 UTC m=+3.011298277,LastTimestamp:2026-03-19 00:07:18.473103146 +0000 UTC m=+3.011298277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.220757 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567b9d830ab openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.474322091 +0000 UTC m=+3.012517222,LastTimestamp:2026-03-19 00:07:18.474322091 +0000 UTC m=+3.012517222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.228487 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e1567ba6d935c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.48411222 +0000 UTC m=+3.022307351,LastTimestamp:2026-03-19 00:07:18.48411222 +0000 UTC m=+3.022307351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.236694 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567bc66367d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.517184125 +0000 UTC m=+3.055379246,LastTimestamp:2026-03-19 00:07:18.517184125 +0000 UTC m=+3.055379246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.244500 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567bc7a0c2c 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.518484012 +0000 UTC m=+3.056679133,LastTimestamp:2026-03-19 00:07:18.518484012 +0000 UTC m=+3.056679133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.252544 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567c44a4b51 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.649572177 +0000 UTC m=+3.187767308,LastTimestamp:2026-03-19 00:07:18.649572177 +0000 UTC m=+3.187767308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.259135 4745 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567c50390ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.661714154 +0000 UTC m=+3.199909285,LastTimestamp:2026-03-19 00:07:18.661714154 +0000 UTC m=+3.199909285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.267656 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567c51bec49 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.663310409 +0000 UTC m=+3.201505540,LastTimestamp:2026-03-19 00:07:18.663310409 +0000 UTC 
m=+3.201505540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.275527 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567c9060c05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.728985605 +0000 UTC m=+3.267180736,LastTimestamp:2026-03-19 00:07:18.728985605 +0000 UTC m=+3.267180736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.282518 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567c9ac6c4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.739889227 
+0000 UTC m=+3.278084348,LastTimestamp:2026-03-19 00:07:18.739889227 +0000 UTC m=+3.278084348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.291076 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567c9d5cb6a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.742600554 +0000 UTC m=+3.280795685,LastTimestamp:2026-03-19 00:07:18.742600554 +0000 UTC m=+3.280795685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.294659 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567d031c0ac openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.849290412 +0000 UTC m=+3.387485563,LastTimestamp:2026-03-19 00:07:18.849290412 +0000 UTC m=+3.387485563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.298468 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e1567d11fea7f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.864898687 +0000 UTC m=+3.403093828,LastTimestamp:2026-03-19 00:07:18.864898687 +0000 UTC m=+3.403093828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: I0319 00:07:40.299443 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:40 crc kubenswrapper[4745]: I0319 00:07:40.301513 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:40 crc kubenswrapper[4745]: I0319 00:07:40.301570 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:40 crc kubenswrapper[4745]: I0319 00:07:40.301587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.306808 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567d5e4ccba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.944910522 +0000 UTC m=+3.483105653,LastTimestamp:2026-03-19 00:07:18.944910522 +0000 UTC m=+3.483105653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.313813 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1567d5f638d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.946052305 +0000 UTC m=+3.484247436,LastTimestamp:2026-03-19 00:07:18.946052305 +0000 UTC m=+3.484247436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.321115 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567d6991cd6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.95672751 +0000 UTC m=+3.494922641,LastTimestamp:2026-03-19 00:07:18.95672751 +0000 UTC m=+3.494922641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.327689 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567d6accfdf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:18.958018527 +0000 UTC m=+3.496213658,LastTimestamp:2026-03-19 00:07:18.958018527 +0000 UTC m=+3.496213658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.333074 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567e22493a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.150416803 +0000 UTC m=+3.688611934,LastTimestamp:2026-03-19 00:07:19.150416803 +0000 UTC m=+3.688611934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 
00:07:40.336822 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567e38afb8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.173905294 +0000 UTC m=+3.712100435,LastTimestamp:2026-03-19 00:07:19.173905294 +0000 UTC m=+3.712100435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.342999 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567e3ab77a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.176034215 +0000 UTC m=+3.714229346,LastTimestamp:2026-03-19 
00:07:19.176034215 +0000 UTC m=+3.714229346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.350968 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1567e4e179f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.196350965 +0000 UTC m=+3.734546096,LastTimestamp:2026-03-19 00:07:19.196350965 +0000 UTC m=+3.734546096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.358511 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567eec7ed77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.362448759 +0000 UTC m=+3.900643890,LastTimestamp:2026-03-19 00:07:19.362448759 +0000 UTC m=+3.900643890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.365251 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567ef83aaeb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.374752491 +0000 UTC m=+3.912947622,LastTimestamp:2026-03-19 00:07:19.374752491 +0000 UTC m=+3.912947622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.372391 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1567f2232ec2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.418760898 +0000 UTC m=+3.956956029,LastTimestamp:2026-03-19 00:07:19.418760898 +0000 UTC m=+3.956956029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.378787 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1567f3430e60 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.437626976 +0000 UTC m=+3.975822107,LastTimestamp:2026-03-19 00:07:19.437626976 +0000 UTC m=+3.975822107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.386220 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1568212f78f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.208095475 +0000 UTC m=+4.746290606,LastTimestamp:2026-03-19 00:07:20.208095475 +0000 UTC m=+4.746290606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.393121 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15682c31de4f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.392801871 +0000 UTC m=+4.930997002,LastTimestamp:2026-03-19 00:07:20.392801871 +0000 UTC m=+4.930997002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.399500 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15682cbf16e7 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.402056935 +0000 UTC m=+4.940252096,LastTimestamp:2026-03-19 00:07:20.402056935 +0000 UTC m=+4.940252096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.407083 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15682cd2c37e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.403346302 +0000 UTC m=+4.941541473,LastTimestamp:2026-03-19 00:07:20.403346302 +0000 UTC m=+4.941541473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.414700 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e15683b24f860 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.643614816 +0000 UTC m=+5.181809977,LastTimestamp:2026-03-19 00:07:20.643614816 +0000 UTC m=+5.181809977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.420957 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15683c2052e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.660087526 +0000 UTC m=+5.198282697,LastTimestamp:2026-03-19 00:07:20.660087526 +0000 UTC m=+5.198282697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.425270 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15683c34a898 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.661420184 +0000 UTC m=+5.199615315,LastTimestamp:2026-03-19 00:07:20.661420184 +0000 UTC m=+5.199615315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.431976 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e156849929c44 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.88568122 +0000 UTC m=+5.423876351,LastTimestamp:2026-03-19 00:07:20.88568122 +0000 UTC m=+5.423876351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.438827 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15684a3dbd0e 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.89689627 +0000 UTC m=+5.435091411,LastTimestamp:2026-03-19 00:07:20.89689627 +0000 UTC m=+5.435091411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.445241 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e15684a4c9a2a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:20.897870378 +0000 UTC m=+5.436065509,LastTimestamp:2026-03-19 00:07:20.897870378 +0000 UTC m=+5.436065509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.451432 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e15685715b469 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:21.112376425 +0000 UTC m=+5.650571556,LastTimestamp:2026-03-19 00:07:21.112376425 +0000 UTC m=+5.650571556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.456092 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e156857d3f200 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:21.124844032 +0000 UTC m=+5.663039163,LastTimestamp:2026-03-19 00:07:21.124844032 +0000 UTC m=+5.663039163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.460270 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e156857eb6da7 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:21.126383015 +0000 UTC m=+5.664578146,LastTimestamp:2026-03-19 00:07:21.126383015 +0000 UTC m=+5.664578146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.467076 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e1568625f748b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:21.301759115 +0000 UTC m=+5.839954256,LastTimestamp:2026-03-19 00:07:21.301759115 +0000 UTC m=+5.839954256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.471084 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e1568630c7c6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:21.313098859 +0000 UTC m=+5.851293980,LastTimestamp:2026-03-19 00:07:21.313098859 +0000 UTC m=+5.851293980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.481531 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-controller-manager-crc.189e156a3d8a891f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 00:07:40 crc kubenswrapper[4745]: body: Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.273760031 +0000 UTC m=+13.811955152,LastTimestamp:2026-03-19 00:07:29.273760031 +0000 UTC m=+13.811955152,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.488198 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156a3d8b85d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.273824723 +0000 UTC m=+13.812019854,LastTimestamp:2026-03-19 00:07:29.273824723 +0000 UTC m=+13.812019854,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.496573 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1e841b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 00:07:40 
crc kubenswrapper[4745]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 00:07:40 crc kubenswrapper[4745]: Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837106203 +0000 UTC m=+14.375301334,LastTimestamp:2026-03-19 00:07:29.837106203 +0000 UTC m=+14.375301334,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.503449 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1f6243 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837163075 +0000 UTC m=+14.375358226,LastTimestamp:2026-03-19 00:07:29.837163075 +0000 UTC m=+14.375358226,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.510766 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e156a5f1e841b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event=< Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1e841b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 00:07:40 crc kubenswrapper[4745]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 00:07:40 crc kubenswrapper[4745]: Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837106203 +0000 UTC m=+14.375301334,LastTimestamp:2026-03-19 00:07:29.84328673 +0000 UTC m=+14.381481861,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.519269 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e156a5f1f6243\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1f6243 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837163075 +0000 UTC m=+14.375358226,LastTimestamp:2026-03-19 00:07:29.843333951 +0000 UTC m=+14.381529072,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.528799 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e1567e3ab77a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567e3ab77a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.176034215 +0000 UTC m=+3.714229346,LastTimestamp:2026-03-19 00:07:30.248836154 +0000 UTC m=+14.787031305,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.535939 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e1567eec7ed77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567eec7ed77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.362448759 +0000 UTC m=+3.900643890,LastTimestamp:2026-03-19 00:07:30.510067505 +0000 UTC m=+15.048262646,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.542919 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e1567ef83aaeb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567ef83aaeb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.374752491 +0000 UTC m=+3.912947622,LastTimestamp:2026-03-19 00:07:30.523764296 +0000 UTC m=+15.061959467,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.552171 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-controller-manager-crc.189e156c91a747d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 00:07:40 crc kubenswrapper[4745]: body: Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:39.274864598 +0000 UTC m=+23.813059769,LastTimestamp:2026-03-19 00:07:39.274864598 +0000 UTC m=+23.813059769,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.559308 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156c91a8f5da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:39.274974682 +0000 UTC m=+23.813169843,LastTimestamp:2026-03-19 00:07:39.274974682 +0000 UTC m=+23.813169843,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:41 crc kubenswrapper[4745]: I0319 00:07:41.086838 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:41 crc kubenswrapper[4745]: W0319 00:07:41.246390 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 00:07:41 crc kubenswrapper[4745]: E0319 00:07:41.246483 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 00:07:42 crc kubenswrapper[4745]: I0319 00:07:42.084415 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.086648 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.245932 4745 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.249684 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.249872 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.249964 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.250143 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:43 crc kubenswrapper[4745]: E0319 00:07:43.256593 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:07:43 crc kubenswrapper[4745]: E0319 00:07:43.257414 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:07:43 crc kubenswrapper[4745]: W0319 00:07:43.479386 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:43 crc kubenswrapper[4745]: E0319 00:07:43.479459 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" 
logger="UnhandledError" Mar 19 00:07:44 crc kubenswrapper[4745]: I0319 00:07:44.088184 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:45 crc kubenswrapper[4745]: I0319 00:07:45.087088 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:46 crc kubenswrapper[4745]: I0319 00:07:46.085969 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:46 crc kubenswrapper[4745]: E0319 00:07:46.221320 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:07:47 crc kubenswrapper[4745]: I0319 00:07:47.087283 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.083585 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.455503 4745 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.455605 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer" Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.455694 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.456135 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.458503 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.458580 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.458605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.461446 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.462924 4745 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 00:07:48 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-controller-manager-crc.189e156eb4ddd2c6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer Mar 19 00:07:48 crc kubenswrapper[4745]: body: Mar 19 00:07:48 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:48.455576262 +0000 UTC m=+32.993771433,LastTimestamp:2026-03-19 00:07:48.455576262 +0000 UTC m=+32.993771433,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:48 crc kubenswrapper[4745]: > Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.463008 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1" gracePeriod=30 Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.472387 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156eb4dedce0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:48.455644384 +0000 UTC m=+32.993839555,LastTimestamp:2026-03-19 00:07:48.455644384 +0000 UTC m=+32.993839555,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.481989 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156eb54ee36e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:48.462986094 +0000 UTC m=+33.001181295,LastTimestamp:2026-03-19 00:07:48.462986094 +0000 UTC m=+33.001181295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.496867 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e15676cef6c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15676cef6c8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.183999114 +0000 UTC m=+1.722194245,LastTimestamp:2026-03-19 00:07:48.490004045 +0000 UTC m=+33.028199206,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.750726 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e15677ed785c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677ed785c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.484422595 +0000 UTC m=+2.022617726,LastTimestamp:2026-03-19 00:07:48.74397603 +0000 UTC m=+33.282171211,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.767850 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e15677f9d6f78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677f9d6f78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.497393016 +0000 UTC m=+2.035588157,LastTimestamp:2026-03-19 00:07:48.759577536 +0000 UTC m=+33.297772677,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.088416 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.137189 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.138987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.139057 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.139073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.139899 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.327538 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.327935 4745 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1" exitCode=255 Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.327994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1"} Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.328034 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c"} Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.328144 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.329330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.329365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.329378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.085249 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.257602 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259628 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:50 crc kubenswrapper[4745]: E0319 
00:07:50.263704 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:07:50 crc kubenswrapper[4745]: E0319 00:07:50.264207 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.333501 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.334063 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336412 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" exitCode=255 Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336452 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9"} Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336494 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336727 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 
00:07:50.337959 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.338007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.338022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.338824 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:07:50 crc kubenswrapper[4745]: E0319 00:07:50.339120 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.083666 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.342135 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.534591 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.534819 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.536249 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.536299 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.536349 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:52 crc kubenswrapper[4745]: I0319 00:07:52.085296 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:53 crc kubenswrapper[4745]: I0319 00:07:53.087815 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:54 crc kubenswrapper[4745]: I0319 00:07:54.083361 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.084364 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.108579 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.108829 
4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.110243 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.110290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.110306 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.111103 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:07:55 crc kubenswrapper[4745]: E0319 00:07:55.111369 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.085399 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:56 crc kubenswrapper[4745]: E0319 00:07:56.222149 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.274456 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:56 crc 
kubenswrapper[4745]: I0319 00:07:56.274721 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.277502 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.277560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.277576 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.282267 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.358475 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.359571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.359624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.359643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.788212 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.788411 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.789734 4745 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.789782 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.789795 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.790525 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:07:56 crc kubenswrapper[4745]: E0319 00:07:56.790734 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.086141 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.263846 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265463 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265480 4745 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265520 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:07:57 crc kubenswrapper[4745]: E0319 00:07:57.269524 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:07:57 crc kubenswrapper[4745]: E0319 00:07:57.271321 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:07:57 crc kubenswrapper[4745]: W0319 00:07:57.722715 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 00:07:57 crc kubenswrapper[4745]: E0319 00:07:57.722776 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 00:07:58 crc kubenswrapper[4745]: I0319 00:07:58.087360 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:07:59 crc kubenswrapper[4745]: I0319 00:07:59.086135 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at 
the cluster scope Mar 19 00:07:59 crc kubenswrapper[4745]: W0319 00:07:59.801722 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 00:07:59 crc kubenswrapper[4745]: E0319 00:07:59.801784 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 00:08:00 crc kubenswrapper[4745]: I0319 00:08:00.087306 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.087226 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.542002 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.542294 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.544026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.544101 4745 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.544131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:02 crc kubenswrapper[4745]: I0319 00:08:02.083492 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:03 crc kubenswrapper[4745]: I0319 00:08:03.086248 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:03 crc kubenswrapper[4745]: W0319 00:08:03.891505 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 00:08:03 crc kubenswrapper[4745]: E0319 00:08:03.892048 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.087415 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.270378 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.272284 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.272547 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.272791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.273060 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:04 crc kubenswrapper[4745]: E0319 00:08:04.277509 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:08:04 crc kubenswrapper[4745]: E0319 00:08:04.277572 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:08:05 crc kubenswrapper[4745]: I0319 00:08:05.086696 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.085233 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:06 crc kubenswrapper[4745]: E0319 00:08:06.223488 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 
00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.553661 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.553946 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.555730 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.555799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.555816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:07 crc kubenswrapper[4745]: I0319 00:08:07.089099 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:07 crc kubenswrapper[4745]: W0319 00:08:07.284274 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:07 crc kubenswrapper[4745]: E0319 00:08:07.284337 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 00:08:08 crc kubenswrapper[4745]: I0319 00:08:08.085814 4745 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.086799 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.137461 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.138838 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.138908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.138931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.139694 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:08:09 crc kubenswrapper[4745]: E0319 00:08:09.139974 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:10 crc kubenswrapper[4745]: I0319 00:08:10.085017 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.084932 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.278111 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280352 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280392 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:11 crc kubenswrapper[4745]: E0319 00:08:11.284833 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:08:11 crc kubenswrapper[4745]: E0319 00:08:11.286722 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:08:12 crc kubenswrapper[4745]: I0319 00:08:12.084092 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:13 crc kubenswrapper[4745]: I0319 00:08:13.086588 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:14 crc kubenswrapper[4745]: I0319 00:08:14.086112 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:15 crc kubenswrapper[4745]: I0319 00:08:15.085224 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:16 crc kubenswrapper[4745]: I0319 00:08:16.085796 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:16 crc kubenswrapper[4745]: E0319 00:08:16.223777 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:17 crc kubenswrapper[4745]: I0319 00:08:17.085405 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.086786 4745 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.285911 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.289614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.289764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.289856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.290008 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:18 crc kubenswrapper[4745]: E0319 00:08:18.295222 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:08:18 crc kubenswrapper[4745]: E0319 00:08:18.295774 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:08:19 crc kubenswrapper[4745]: I0319 00:08:19.085051 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.101968 4745 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.817995 4745 csr.go:261] certificate signing request csr-lh9q4 is approved, waiting to be issued Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.829750 4745 csr.go:257] certificate signing request csr-lh9q4 is issued Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.860560 4745 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.925282 4745 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 00:08:21 crc kubenswrapper[4745]: I0319 00:08:21.830990 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-24 01:51:09.454847496 +0000 UTC Mar 19 00:08:21 crc kubenswrapper[4745]: I0319 00:08:21.831091 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6721h42m47.623762766s for next certificate rotation Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.137143 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.138942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.139020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.139039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.140126 
4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.500748 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.502781 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837"} Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.502968 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.503854 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.503912 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.503928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.509717 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.510693 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514376 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" exitCode=255 Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514442 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837"} Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514501 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514803 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.516646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.516701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.516724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.517771 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:23 crc kubenswrapper[4745]: E0319 00:08:23.518091 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:24 crc kubenswrapper[4745]: I0319 
00:08:24.519338 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.108460 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.108723 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.110327 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.110402 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.110424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.111551 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.111988 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.295791 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.297392 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.297616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.297718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.298028 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.307855 4745 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.308389 4745 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.308436 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.311943 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312065 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312082 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.328024 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335482 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335533 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.347457 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357792 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357818 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.370931 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381201 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381279 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.393453 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.393599 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.393626 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.493961 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.507854 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.594457 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.694803 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.795044 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.895320 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.995643 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.096192 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.196618 
4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.225028 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.297550 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.398045 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.498183 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.599336 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.699971 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.789226 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.789395 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.791070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.791218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.791331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.792813 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.793336 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.801006 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.902192 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.003311 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.103683 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.204374 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.305131 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.405919 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.506042 4745 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.606435 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.708065 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.808572 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.909609 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.010420 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.111114 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.211909 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.312190 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.413212 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.514255 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.615045 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.716066 4745 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.816244 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.917026 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.017686 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.118916 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.219967 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.320361 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.420731 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.521757 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.622370 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.722489 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.822903 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc 
kubenswrapper[4745]: E0319 00:08:29.923845 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.024982 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.125796 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.226705 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.327832 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.428939 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.529595 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.629962 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.730308 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.830931 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.931552 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.032570 4745 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.133209 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.233457 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.334548 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.434951 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.535030 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.635830 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.736442 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.836908 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.937271 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.037475 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.138563 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.239665 4745 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.340125 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.441235 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.541533 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.642333 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.742976 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: I0319 00:08:32.796284 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.843697 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.944709 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.045798 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.146576 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.247284 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 
00:08:33.348476 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.448868 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.549611 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.650461 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.750620 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.851550 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.952172 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.052803 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.153546 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.253997 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.354335 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.454470 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.555174 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.655560 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.756279 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.856866 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.957930 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.059080 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.160262 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.260732 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.361304 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.461987 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.562803 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.570927 4745 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="error getting node \"crc\": node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576376 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576404 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576417 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.587654 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590662 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590783 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.598965 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602038 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602094 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602116 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602132 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.610702 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615273 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615283 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615300 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615644 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.626960 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.627139 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.663457 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.763967 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.864484 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.965042 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.065425 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.137425 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.138828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.138885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.138937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.166245 
4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.225291 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.266813 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.367303 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.467592 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.568549 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.669470 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.770413 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.871300 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.971737 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.072688 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.137233 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.138403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.138457 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.138475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.139151 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.139369 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.173322 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.273804 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.374928 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.475061 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.575670 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.676665 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.777500 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.878411 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.979208 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.080381 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.180855 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.281332 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.381927 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.482579 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.583019 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.683696 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.784723 4745 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.885108 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.985682 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.086453 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.186877 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.287630 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.387775 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.488652 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.589820 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.690188 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.790492 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.891644 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.992754 4745 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.093621 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.194066 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.295062 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.395810 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.495956 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.597001 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.698100 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.798851 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.899460 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.999627 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.100311 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc 
kubenswrapper[4745]: E0319 00:08:41.200455 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.300795 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.401970 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.502691 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.603414 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.704261 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.805331 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.906506 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.007218 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.107721 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.208290 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.308648 4745 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.409804 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.510350 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.611192 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.711460 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.812270 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.912643 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.012769 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.113206 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.214055 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.314877 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.415602 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.516661 4745 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.617748 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.717945 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.818766 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.919199 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.019796 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.119989 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.220542 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.320908 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.421704 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.522248 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.623207 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 
00:08:44.676364 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727466 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727506 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:44Z","lastTransitionTime":"2026-03-19T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831543 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831646 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:44Z","lastTransitionTime":"2026-03-19T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935325 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935379 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:44Z","lastTransitionTime":"2026-03-19T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038552 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038594 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038608 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038642 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.112124 4745 apiserver.go:52] "Watching apiserver" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.119080 4745 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.119555 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qt5t5","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-w2988","openshift-dns/node-resolver-5xqfc","openshift-multus/network-metrics-daemon-4r5k5","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j","openshift-image-registry/node-ca-xjkg8","openshift-multus/multus-additional-cni-plugins-n8tr6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-multus/multus-mlwp7","openshift-network-diagnostics/network-check-target-xd92c"] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.119983 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120215 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.120417 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120479 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120515 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120550 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120607 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120619 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120717 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.127595 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.127775 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.127948 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.128098 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.128425 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.129316 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.130403 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.130708 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.130870 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.132042 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.133584 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.133996 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.134024 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.134344 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.134639 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.141657 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.142358 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.142649 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143167 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143276 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143398 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143515 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143614 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143731 4745 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143829 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143972 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144225 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144390 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144502 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144597 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144698 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144818 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145118 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 
00:08:45.145255 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145378 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145493 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145635 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.146484 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.146685 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147078 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147478 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147472 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147530 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147535 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.149240 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150063 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150161 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150171 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.162403 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.171426 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.184959 4745 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.185660 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.200923 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.212282 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214813 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214907 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214943 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215014 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215086 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.215121 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215153 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215187 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215221 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215253 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215317 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215385 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215421 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215644 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 
00:08:45.215676 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215710 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215772 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215994 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216102 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216136 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216152 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216199 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216244 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.216288 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216318 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216334 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216426 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216446 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216463 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216512 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216532 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216560 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216584 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216607 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216625 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216720 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216741 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216759 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216761 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216781 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216802 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216856 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216938 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216961 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216979 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216997 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217017 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217033 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217047 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217067 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217083 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217100 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.217140 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217157 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217161 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217202 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217224 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217263 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217283 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217300 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217315 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217333 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217317 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217410 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217608 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217634 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217644 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217706 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217990 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218559 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218576 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218605 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218995 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.219291 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.219865 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.219993 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220083 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220124 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220348 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220404 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220659 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220733 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220865 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221114 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221226 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221262 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221276 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221334 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221391 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221417 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221442 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221488 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221535 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221597 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221649 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221699 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221712 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221746 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221751 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221798 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221958 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221993 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222037 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222141 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222153 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222182 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222225 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222289 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222326 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222361 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222338 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222391 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222527 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222567 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222639 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222700 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.222769 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222809 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222839 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222843 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222902 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222909 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222963 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222998 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223027 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.223112 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223146 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223177 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223207 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223237 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223270 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223424 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223460 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223531 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223564 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223595 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.223631 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223663 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223767 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223836 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223861 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223904 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223928 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223951 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223973 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223991 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224012 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224031 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224051 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224078 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224098 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") 
pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224136 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224152 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224171 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224189 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224214 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224233 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224253 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224273 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224295 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224313 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224364 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224393 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224417 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224445 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224469 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: 
I0319 00:08:45.224498 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224552 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224573 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224612 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224632 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224651 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224680 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224703 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224725 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224744 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224762 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224782 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224803 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224821 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224840 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224861 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224905 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224933 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224956 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224975 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224994 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225012 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225032 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225050 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225144 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225171 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225199 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.225246 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.725212724 +0000 UTC m=+90.263407865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225294 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225331 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225358 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225384 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225410 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225434 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225466 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225492 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225516 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225542 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.225567 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225644 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225687 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225788 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225851 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225919 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226036 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226076 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09cb2800-ce49-44cf-89b5-d1e5459299c5-serviceca\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226110 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-kube-api-access-j7kx7\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226138 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/400972f4-050f-4f26-b982-ced6f2590c8b-proxy-tls\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226159 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226181 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cni-binary-copy\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226209 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226239 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226290 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226320 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.226342 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226371 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226402 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-multus-certs\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226428 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-etc-kubernetes\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226460 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226488 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226515 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-kubelet\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226537 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226560 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226581 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod 
\"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226605 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-os-release\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226632 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226655 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-k8s-cni-cncf-io\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226676 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226697 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226723 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226746 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09cb2800-ce49-44cf-89b5-d1e5459299c5-host\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226769 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226790 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226812 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226839 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226863 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226911 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtfl\" (UniqueName: \"kubernetes.io/projected/09cb2800-ce49-44cf-89b5-d1e5459299c5-kube-api-access-gbtfl\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226940 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226963 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-socket-dir-parent\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226986 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-daemon-config\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227009 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227033 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227055 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-conf-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227077 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p582k\" (UniqueName: \"kubernetes.io/projected/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-kube-api-access-p582k\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227103 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227128 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwl4z\" (UniqueName: \"kubernetes.io/projected/9ddaa87b-caf7-46de-b693-96c60909d05e-kube-api-access-qwl4z\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227149 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-netns\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.227209 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rc4\" (UniqueName: \"kubernetes.io/projected/33ce3f8d-5035-4139-b206-f3c36e53618c-kube-api-access-48rc4\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-system-cni-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227261 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-os-release\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227284 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-bin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227304 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-system-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227366 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227398 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227432 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227470 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227502 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227533 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/400972f4-050f-4f26-b982-ced6f2590c8b-rootfs\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227562 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-multus\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227593 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227621 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-cnibin\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95ftv\" (UniqueName: \"kubernetes.io/projected/c4250eab-9d3c-457f-9a78-50400c5f65f3-kube-api-access-95ftv\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227669 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxx4n\" (UniqueName: \"kubernetes.io/projected/400972f4-050f-4f26-b982-ced6f2590c8b-kube-api-access-jxx4n\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227706 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-hostroot\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227728 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227752 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33ce3f8d-5035-4139-b206-f3c36e53618c-hosts-file\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227776 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227829 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227855 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 
00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227941 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/400972f4-050f-4f26-b982-ced6f2590c8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cnibin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228022 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228054 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228081 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: 
\"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228175 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228192 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228209 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228225 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228239 4745 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228259 4745 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228273 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 
00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228287 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228301 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228315 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228328 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228342 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228355 4745 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228370 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228384 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228398 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228413 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228428 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228442 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228456 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228470 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228486 4745 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228501 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228516 4745 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228529 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228543 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228557 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228573 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228589 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228617 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228644 4745 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228668 4745 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228689 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228708 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228726 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228749 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228766 4745 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228785 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228980 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229000 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229089 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229109 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229128 4745 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229147 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229755 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229795 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229858 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.230090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.230132 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.230361 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.231353 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.231632 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.231944 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.232576 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.233016 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234389 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234596 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234687 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234820 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235064 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235248 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235261 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235570 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236116 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236165 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236480 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236571 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236871 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237100 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237101 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237147 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237150 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237485 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237523 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237991 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238145 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238172 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238334 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238524 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238545 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239016 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239029 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239035 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239180 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239862 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240044 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242063 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240062 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240199 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240306 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240311 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240381 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.241518 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.241951 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242303 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242448 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242775 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.243402 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.243459 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.243993 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244397 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244715 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244421 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244789 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245112 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245354 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245436 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245598 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245896 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.246050 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.246105 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.246700 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.247471 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248061 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248177 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248181 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248722 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248817 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.249023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.249072 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.251602 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252022 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252530 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252678 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252957 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252235 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253348 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253416 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253406 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253520 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.253771 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253874 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.253936 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.753906977 +0000 UTC m=+90.292102118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.254515 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.254743 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.254946 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.255202 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.255658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.255760 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.256187 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.256484 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.256791 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.257061 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.257255 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.757211114 +0000 UTC m=+90.295406285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.257626 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.257857 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.258214 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.258303 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.259617 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.259849 4745 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.259862 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.260312 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.260696 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261574 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261966 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262342 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262438 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262485 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262806 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262866 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.263013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261271 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.265369 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.273202 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.276946 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261828 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.271299 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274875 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277877 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277906 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.272119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.267433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.268159 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.272175 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.272777 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.273030 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274130 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274587 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.275093 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.275125 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.278541 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.278566 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.275548 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.260856 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277480 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.278711 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.778650463 +0000 UTC m=+90.316845594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280556 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280593 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280609 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280680 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.780658157 +0000 UTC m=+90.318853298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.281007 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.282483 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.282600 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.282678 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod 
"7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.283120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.285455 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.292648 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.293167 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.293377 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.293477 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294512 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294474 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294994 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.295015 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.295897 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.306093 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.307606 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.311217 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.318949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.319747 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329105 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329581 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-system-cni-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329619 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-os-release\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwl4z\" (UniqueName: \"kubernetes.io/projected/9ddaa87b-caf7-46de-b693-96c60909d05e-kube-api-access-qwl4z\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329737 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-os-release\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329735 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-system-cni-dir\") pod 
\"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329777 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329800 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-netns\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329821 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329841 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rc4\" (UniqueName: \"kubernetes.io/projected/33ce3f8d-5035-4139-b206-f3c36e53618c-kube-api-access-48rc4\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329860 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-system-cni-dir\") pod 
\"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329857 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-netns\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.330032 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-system-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.330706 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332067 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-bin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332103 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.332147 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332167 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332210 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-cnibin\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332228 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " 
pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/400972f4-050f-4f26-b982-ced6f2590c8b-rootfs\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332304 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-multus\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332326 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332345 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95ftv\" (UniqueName: \"kubernetes.io/projected/c4250eab-9d3c-457f-9a78-50400c5f65f3-kube-api-access-95ftv\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxx4n\" (UniqueName: \"kubernetes.io/projected/400972f4-050f-4f26-b982-ced6f2590c8b-kube-api-access-jxx4n\") pod 
\"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332390 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-hostroot\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332406 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332424 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33ce3f8d-5035-4139-b206-f3c36e53618c-hosts-file\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332477 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") 
" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332494 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/400972f4-050f-4f26-b982-ced6f2590c8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332512 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cnibin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332530 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332546 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332564 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: 
\"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332600 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09cb2800-ce49-44cf-89b5-d1e5459299c5-serviceca\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332618 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-kube-api-access-j7kx7\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332642 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/400972f4-050f-4f26-b982-ced6f2590c8b-proxy-tls\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332659 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332685 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cni-binary-copy\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 
19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332736 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-multus-certs\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332770 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332787 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332821 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332839 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-kubelet\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332856 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-etc-kubernetes\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332872 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332909 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-os-release\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332932 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332951 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332997 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09cb2800-ce49-44cf-89b5-d1e5459299c5-host\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333028 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-k8s-cni-cncf-io\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333044 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-bin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333069 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333608 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33ce3f8d-5035-4139-b206-f3c36e53618c-hosts-file\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333628 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333635 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333645 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-kubelet\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333687 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333706 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333709 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.333750 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333784 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtfl\" (UniqueName: \"kubernetes.io/projected/09cb2800-ce49-44cf-89b5-d1e5459299c5-kube-api-access-gbtfl\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333819 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-socket-dir-parent\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333838 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-daemon-config\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333856 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.333863 4745 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333963 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-k8s-cni-cncf-io\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333966 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333994 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.334002 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.833907999 +0000 UTC m=+90.372103120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334036 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-conf-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334059 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p582k\" (UniqueName: \"kubernetes.io/projected/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-kube-api-access-p582k\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334279 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333690 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334536 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-os-release\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334871 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-conf-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335096 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335246 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/400972f4-050f-4f26-b982-ced6f2590c8b-rootfs\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335352 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335475 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/400972f4-050f-4f26-b982-ced6f2590c8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335524 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cnibin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.335550 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335553 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335588 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-hostroot\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335607 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335630 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335583 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-cnibin\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335926 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-multus-certs\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335968 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336026 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-etc-kubernetes\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336062 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336087 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-multus\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336613 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09cb2800-ce49-44cf-89b5-d1e5459299c5-serviceca\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336631 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09cb2800-ce49-44cf-89b5-d1e5459299c5-host\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337253 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-daemon-config\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" 
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337289 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337449 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337465 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337476 4745 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337486 4745 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337498 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337509 4745 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath 
\"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337521 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337533 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337547 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337558 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337569 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337583 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337597 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 
00:08:45.337613 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337626 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337640 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337655 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337670 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337697 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337716 4745 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337730 4745 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337744 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337755 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337765 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337776 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337787 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337798 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337807 4745 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337819 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337828 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337839 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337850 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337861 4745 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337870 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337882 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337907 4745 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340066 4745 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340082 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340468 4745 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340480 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340491 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340082 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-socket-dir-parent\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340543 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340556 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340566 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340576 4745 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.339923 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/400972f4-050f-4f26-b982-ced6f2590c8b-proxy-tls\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340585 4745 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 
00:08:45.340632 4745 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340643 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340654 4745 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340665 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340676 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340687 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340700 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340709 4745 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340720 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340731 4745 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340803 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340821 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340841 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340860 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340926 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340942 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340956 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340972 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340985 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340999 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341013 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341027 4745 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341044 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341057 4745 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341070 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341084 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341100 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341114 4745 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341129 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341145 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341159 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341174 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341187 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341200 4745 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341214 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath 
\"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341232 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341248 4745 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341264 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341280 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341295 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341310 4745 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341331 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341347 4745 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341362 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341379 4745 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341398 4745 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341413 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341429 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341447 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341461 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341477 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341494 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341517 4745 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341541 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341559 4745 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341574 4745 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341588 4745 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 
00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341603 4745 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341619 4745 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341634 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341649 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341665 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341680 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341696 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341711 4745 reconciler_common.go:293] 
"Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341728 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341741 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341754 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341772 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341786 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341800 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341813 4745 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341827 4745 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341843 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341857 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341872 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341909 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341924 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341937 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341951 4745 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341967 4745 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341984 4745 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341997 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342012 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342029 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342049 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342070 4745 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342092 4745 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342110 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342127 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342146 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342168 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342188 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342205 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342223 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342241 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342260 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342278 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342298 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342314 4745 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342328 4745 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342340 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342355 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342372 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.343196 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cni-binary-copy\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.346008 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.348627 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.348697 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.352242 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.352508 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rc4\" (UniqueName: \"kubernetes.io/projected/33ce3f8d-5035-4139-b206-f3c36e53618c-kube-api-access-48rc4\") pod \"node-resolver-5xqfc\" (UID: 
\"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.352628 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p582k\" (UniqueName: \"kubernetes.io/projected/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-kube-api-access-p582k\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.353272 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxx4n\" (UniqueName: \"kubernetes.io/projected/400972f4-050f-4f26-b982-ced6f2590c8b-kube-api-access-jxx4n\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.353363 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95ftv\" (UniqueName: \"kubernetes.io/projected/c4250eab-9d3c-457f-9a78-50400c5f65f3-kube-api-access-95ftv\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.354599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.354703 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwl4z\" (UniqueName: \"kubernetes.io/projected/9ddaa87b-caf7-46de-b693-96c60909d05e-kube-api-access-qwl4z\") pod \"multus-additional-cni-plugins-n8tr6\" 
(UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.361081 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.361264 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtfl\" (UniqueName: \"kubernetes.io/projected/09cb2800-ce49-44cf-89b5-d1e5459299c5-kube-api-access-gbtfl\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.365325 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-kube-api-access-j7kx7\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.372780 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381497 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381539 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.483940 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.483987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.483999 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.484020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.484033 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.488536 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.506459 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.513433 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e WatchSource:0}: Error finding container 55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e: Status 404 returned error can't find the container with id 55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.516744 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.523324 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44 WatchSource:0}: Error finding container 74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44: Status 404 returned error can't find the container with id 74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.524549 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.544305 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d WatchSource:0}: Error finding container f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d: Status 404 returned error can't find the container with id f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.548324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.558453 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4250eab_9d3c_457f_9a78_50400c5f65f3.slice/crio-33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd WatchSource:0}: Error finding container 33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd: Status 404 returned error can't find the container with id 33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.570248 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ce3f8d_5035_4139_b206_f3c36e53618c.slice/crio-4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876 WatchSource:0}: Error finding container 4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876: Status 404 returned error can't find the container with id 4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.576876 4745 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/node-resolver-5xqfc" event={"ID":"33ce3f8d-5035-4139-b206-f3c36e53618c","Type":"ContainerStarted","Data":"4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.577825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" event={"ID":"c4250eab-9d3c-457f-9a78-50400c5f65f3","Type":"ContainerStarted","Data":"33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.578693 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.579291 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.580384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.587782 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.587997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588085 4745 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588119 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588187 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.604819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.612195 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.620308 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.626259 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.653328 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ddaa87b_caf7_46de_b693_96c60909d05e.slice/crio-b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe WatchSource:0}: Error finding container b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe: Status 404 returned error can't find the container with id b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.663345 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400972f4_050f_4f26_b982_ced6f2590c8b.slice/crio-232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066 WatchSource:0}: Error finding container 232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066: Status 404 returned error can't find the container with id 232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066 Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.682678 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0ae9c0_f19a_4038_be03_0fa6d223ebbf.slice/crio-e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933 WatchSource:0}: Error finding container e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933: Status 404 returned error can't find the container with id e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691123 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719542 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.730629 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735787 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735822 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.751232 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.751583 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.751556768 +0000 UTC m=+91.289751899 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.762523 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777778 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777820 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777836 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.788013 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793676 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.805088 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810513 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810534 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.819939 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.820053 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822129 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822168 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822186 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822207 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822222 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852389 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852643 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852694 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.852907 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.852969 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.852949998 +0000 UTC m=+91.391145129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853057 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853084 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853097 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 
00:08:45.853130 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853121544 +0000 UTC m=+91.391316675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853178 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853209 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853199697 +0000 UTC m=+91.391394838 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853256 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853285 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853277009 +0000 UTC m=+91.391472140 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853345 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853359 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853371 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853400 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853391153 +0000 UTC m=+91.391586284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924325 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924363 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924372 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924401 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.026709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.026980 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.027062 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.027137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.027202 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.129472 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.129752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.129852 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.130092 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.130349 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.141455 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.142710 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.143936 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.144630 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.145729 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.146396 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.147131 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.148244 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.148938 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.149935 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.150543 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.151663 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.152586 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.152771 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.153289 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.154369 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.156180 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.157227 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.157689 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.158563 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.159658 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.160252 4745 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.161787 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.162341 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.162514 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.163594 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.164321 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 00:08:46 crc 
kubenswrapper[4745]: I0319 00:08:46.165323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.166608 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.167201 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.168368 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.169115 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.170116 4745 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.170322 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.172510 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" 
path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.173367 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.173973 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.174579 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.176206 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.183095 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.184198 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.186452 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.187437 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.188773 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.189792 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.191358 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.191769 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.192642 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.193845 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.194655 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.196138 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.197132 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.198740 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.199340 4745 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.199774 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.201216 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.202153 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.202797 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.203816 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.209788 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.218393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.227942 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239074 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239095 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239144 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.245369 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.256902 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.265474 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.279847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.293643 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.310909 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342337 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342430 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445739 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548711 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.593739 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.595018 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" exitCode=0 Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.595094 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.595283 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"71f66e30efa5016ee954a4cb19c576186a237cdc85750ce0837af353f57d6b56"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.596408 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771" exitCode=0 Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.596474 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.596641 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerStarted","Data":"b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.601945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" event={"ID":"c4250eab-9d3c-457f-9a78-50400c5f65f3","Type":"ContainerStarted","Data":"dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.603003 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" event={"ID":"c4250eab-9d3c-457f-9a78-50400c5f65f3","Type":"ContainerStarted","Data":"bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.603727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.603779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.607567 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.607612 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.607630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.609505 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xjkg8" event={"ID":"09cb2800-ce49-44cf-89b5-d1e5459299c5","Type":"ContainerStarted","Data":"a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.609559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xjkg8" event={"ID":"09cb2800-ce49-44cf-89b5-d1e5459299c5","Type":"ContainerStarted","Data":"46c60a430906056106f7c91de97ac2cc34eae6186a8498f8e398e75fe6cd086e"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.609645 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.611190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5xqfc" event={"ID":"33ce3f8d-5035-4139-b206-f3c36e53618c","Type":"ContainerStarted","Data":"11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.614465 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.614537 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.631439 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.646376 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652047 4745 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652118 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652140 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.658537 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.668163 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.678164 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.691760 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.707004 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.722165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.737985 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755612 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755812 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.760647 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.761978 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 00:08:48.761953216 +0000 UTC m=+93.300148337 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.769673 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.784173 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.804410 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.821990 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.840529 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.851839 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc 
kubenswrapper[4745]: I0319 00:08:46.859182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859267 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859311 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862817 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862859 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862906 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863033 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863070 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863093 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863065207 +0000 UTC m=+93.401260338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863156 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863136979 +0000 UTC m=+93.401332110 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863159 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863185 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863191 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863199 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863215 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863245 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863245 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863237882 +0000 UTC m=+93.401433013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863215 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863390 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863337666 +0000 UTC m=+93.401532807 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863419 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863405908 +0000 UTC m=+93.401601129 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.870655 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.885938 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.899036 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.917384 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.936404 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.954286 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc 
kubenswrapper[4745]: I0319 00:08:46.961806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961853 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.970150 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.986083 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.009194 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.025248 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.053254 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064855 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064985 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137161 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137233 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137288 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137363 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137532 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137653 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137773 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167369 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167466 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270473 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270511 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373782 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373826 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476091 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476138 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476151 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476184 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.579664 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.579881 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.579988 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.580064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.580125 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622344 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622356 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622366 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622376 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622385 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:08:47 crc kubenswrapper[4745]: 
I0319 00:08:47.624808 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9" exitCode=0 Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.625112 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.641468 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.661818 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.679974 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684473 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684518 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684566 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.695056 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.709701 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.723599 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.740001 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc
8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.754724 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.764278 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787944 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787956 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.793795 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.806957 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.818140 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc 
kubenswrapper[4745]: I0319 00:08:47.832420 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.847163 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891171 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891305 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.994927 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.994975 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.994984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.995000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.995012 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098827 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204077 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204129 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204157 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307632 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307642 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410116 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520455 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520501 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624847 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.630964 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc" exitCode=0 Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.631054 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.662769 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.678342 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.695713 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.713573 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727832 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727953 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.730403 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 
00:08:48.749707 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 
00:08:48.762600 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.776493 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.782242 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.782969 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.782944137 +0000 UTC m=+97.321139268 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.787911 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.797708 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.821049 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830950 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.831002 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.836827 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.848970 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.859705 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc 
kubenswrapper[4745]: I0319 00:08:48.883084 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883155 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883175 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883194 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883305 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883343 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883331086 +0000 UTC m=+97.421526217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883401 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883422 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883416378 +0000 UTC m=+97.421611499 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883480 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883492 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883504 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883525 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883519842 +0000 UTC m=+97.421714973 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883560 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883580 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883573642 +0000 UTC m=+97.421768773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883619 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883628 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883637 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883655 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883649605 +0000 UTC m=+97.421844736 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934048 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934085 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934110 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934122 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037708 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037734 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.137681 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.137972 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.138044 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.137963 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.138159 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.138281 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.138393 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.138496 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145694 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145812 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249313 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249352 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352344 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352358 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455722 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455795 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455823 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455919 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558595 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.635676 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.639757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.643332 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d" exitCode=0 Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.643395 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.658063 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661255 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661307 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.674955 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.691931 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc 
kubenswrapper[4745]: I0319 00:08:49.715624 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.728526 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.742015 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.754953 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764640 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.783052 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 
00:08:49.801113 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 
00:08:49.814255 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.829709 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.844299 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.859122 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869207 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.904354 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.918907 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.929124 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.940651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.952731 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.976291 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983471 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.003719 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.042283 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.065496 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.082044 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086769 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086780 
4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086797 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086807 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.096162 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.109578 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.123734 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.139606 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.150276 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.150516 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:50 crc kubenswrapper[4745]: E0319 00:08:50.150750 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.153786 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc 
kubenswrapper[4745]: I0319 00:08:50.190079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190123 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190138 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190148 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292613 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292635 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292646 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.395914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.395971 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.395986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.396008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.396022 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499439 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499489 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602517 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602619 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.653095 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a" exitCode=0 Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.653163 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.653702 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:50 crc kubenswrapper[4745]: E0319 00:08:50.653862 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.667651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.679955 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.706205 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.709974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710044 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710085 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710119 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.722750 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.736804 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.757225 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.771581 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.786168 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc 
kubenswrapper[4745]: I0319 00:08:50.803234 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815651 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815712 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815751 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.823991 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.838862 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.857048 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.872658 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.887364 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.902674 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919414 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc 
kubenswrapper[4745]: I0319 00:08:50.919447 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021953 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125582 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125689 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125705 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136991 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136969 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137112 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136985 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137349 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137336 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137447 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229494 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229578 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229604 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229673 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333288 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437095 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437107 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437128 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437141 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539513 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539577 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539604 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.642543 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643046 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643077 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.662864 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41" exitCode=0 Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.662987 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.677037 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.677073 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.706764 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.717674 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.722045 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.736456 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745135 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745147 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.761272 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.779238 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.797219 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.813384 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc 
kubenswrapper[4745]: I0319 00:08:51.828418 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.841553 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848901 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848920 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.854949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.867136 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.882321 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.897298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.912469 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e695
07f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.927847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.945324 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951792 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951803 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.958135 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.969846 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.984141 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.003291 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.017339 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.029870 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.054480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058555 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.067232 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.078803 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc 
kubenswrapper[4745]: I0319 00:08:52.091060 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.100078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.109574 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.120786 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.157296 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161775 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161855 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264999 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.265009 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367713 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470637 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573299 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573362 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573391 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.676327 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.683694 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerStarted","Data":"129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.689055 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.689608 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.689658 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.707110 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.719323 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.720652 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.738267 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.752111 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.764412 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780505 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780517 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.788661 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.802609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
9T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.814669 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.826100 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc 
kubenswrapper[4745]: I0319 00:08:52.837578 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.838280 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.838244948 +0000 UTC m=+105.376440079 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.838732 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.850541 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.866865 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.883195 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884484 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.896509 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.913702 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.938115 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.938978 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939115 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939214 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.939186563 +0000 UTC m=+105.477381724 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939219 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939504 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939618 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939713 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939401 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940085 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940067892 +0000 UTC m=+105.478263023 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939663 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940265 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940345 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940467 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940455284 +0000 UTC m=+105.478650525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939753 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940638 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940617809 +0000 UTC m=+105.478812970 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939926 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940693 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940714 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940766 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940750463 +0000 UTC m=+105.478945624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.958069 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.973841 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987812 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987924 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987989 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.988018 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.990085 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.006517 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.027726 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.042532 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.058826 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.075398 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.090626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.090923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.090994 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.091060 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 
00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.091132 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.107400 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137394 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137488 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137545 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137394 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137414 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137727 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137927 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137971 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.139233 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-1
9T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.158787 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.175330 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.190071 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194717 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194763 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194782 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194825 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.204264 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.217660 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.230901 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc 
kubenswrapper[4745]: I0319 00:08:53.297963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298012 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298032 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298046 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400801 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400946 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400959 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.505163 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.609071 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712273 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712303 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.815223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816134 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816149 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125333 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125419 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125435 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228098 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228187 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.330985 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331043 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331088 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434942 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538636 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644366 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644423 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644466 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.701058 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/0.log" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.705148 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850" exitCode=1 Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.705227 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.706646 4745 scope.go:117] "RemoveContainer" containerID="61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.725915 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.739664 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.747817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.747871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.748031 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.748071 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.748097 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.754957 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc 
kubenswrapper[4745]: I0319 00:08:54.771670 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.784095 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.801171 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.817797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.836911 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851734 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851792 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851805 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851839 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.855870 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\
\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4
a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.871605 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.906152 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.924195 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.940370 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.953342 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956788 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956865 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956968 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.963794 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.979067 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059497 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059508 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059540 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.136859 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.136984 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137033 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.137118 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.137007 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137215 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137290 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162742 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162872 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267449 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267470 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370558 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370583 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472256 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472263 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472285 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574810 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574819 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677918 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677954 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.710210 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/0.log" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.712989 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.713374 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.725994 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.736949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.749186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc 
kubenswrapper[4745]: I0319 00:08:55.761897 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780411 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780459 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.795238 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.828352 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.846052 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.862388 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868849 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868937 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.874705 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z 
is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.884361 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.886771 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888389 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888415 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.897996 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.900612 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904786 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904869 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.917301 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 
00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.920555 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924546 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924557 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.939815 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.940071 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944750 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944765 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.952746 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.960855 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.961074 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.962846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.962976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.962998 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.963022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.963040 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.972128 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.986133 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065592 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065613 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065625 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.157320 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.168940 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.168996 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.169016 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.169039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.169054 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.171158 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.182776 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc 
kubenswrapper[4745]: I0319 00:08:56.197387 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.211283 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.222497 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.235896 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.249870 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.266542 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271619 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271637 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.281432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z 
is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.311282 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03
-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.326028 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.340934 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.353240 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.365923 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.374929 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.374985 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.375003 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.375026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.375040 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.390551 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 
00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478634 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478693 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478715 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478780 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582207 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582284 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582295 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685162 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685210 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685256 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685268 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.718587 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.719801 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/0.log" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.723964 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" exitCode=1 Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.724004 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.724063 4745 scope.go:117] "RemoveContainer" containerID="61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.725367 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:08:56 crc kubenswrapper[4745]: E0319 00:08:56.725714 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.736717 4745 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.748464 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.762308 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.777454 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790194 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790252 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.795650 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.808257 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.824186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.846470 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.860041 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.879797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892818 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892854 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.898222 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.914020 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.935706 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ce
b36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.961125 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.977269 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.995458 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc 
kubenswrapper[4745]: I0319 00:08:56.996745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996832 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996931 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100421 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137434 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137477 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137546 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137646 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137634 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137769 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137864 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137977 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203125 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203211 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.305949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306001 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306012 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306041 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.408903 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409539 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513751 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617831 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617859 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617877 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.686811 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722047 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722945 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.723078 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.732153 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.738432 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.738720 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.762958 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.787058 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.806034 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827011 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827068 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827130 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.829231 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.861854 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"da
ta-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.876410 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.890914 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.911315 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.926820 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932421 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932507 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932523 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.961587 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.979483 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.996460 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.009989 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc 
kubenswrapper[4745]: I0319 00:08:58.025331 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035312 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035377 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.038341 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.054348 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc 
kubenswrapper[4745]: I0319 00:08:58.138726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138753 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241314 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241362 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344290 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.447843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448038 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448121 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448147 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551340 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551389 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551451 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655069 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655184 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758345 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758406 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862451 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862536 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862614 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965736 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965755 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069834 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137646 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137697 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137807 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137857 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138004 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138161 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138439 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138674 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174271 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174334 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278037 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278139 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278199 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381602 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381675 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381720 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381733 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485733 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485821 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589248 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589445 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693193 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693254 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797302 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797399 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900812 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900920 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900984 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004496 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107435 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107486 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211614 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315967 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.316113 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.419515 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420134 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420150 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523830 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523939 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627775 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627991 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.628015 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732612 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.835861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.835978 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.835997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.836024 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.836043 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.933463 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:09:00 crc kubenswrapper[4745]: E0319 00:09:00.933874 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 00:09:16.933822026 +0000 UTC m=+121.472017187 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940918 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940964 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035161 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035220 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035244 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035263 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035290 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035385 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035408 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035440 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035426403 +0000 UTC m=+121.573621534 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035442 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035495 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035538 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035561 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035465 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035448564 +0000 UTC m=+121.573643695 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035715 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035624929 +0000 UTC m=+121.573820100 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035718 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035777 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035754094 +0000 UTC m=+121.573949305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035801 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035837 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.036049 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.036005362 +0000 UTC m=+121.574200523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046084 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046123 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137050 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137101 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137217 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137223 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137287 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137343 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137486 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137642 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149408 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149485 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149506 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149525 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252261 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252345 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252368 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252417 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355294 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459569 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562785 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562840 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666755 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666803 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769709 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873595 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873638 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977842 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.079981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080071 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.151980 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.182960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183225 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285789 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388266 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388279 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388301 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388316 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497294 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497376 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497422 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600464 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600522 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704162 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807281 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807420 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910438 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910506 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910519 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910564 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013518 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.117990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118102 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118168 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137540 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137583 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137555 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.137751 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137803 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.138052 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.138175 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.138323 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221678 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221769 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324667 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530756 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530769 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634062 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634089 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634100 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.736904 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737004 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737062 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840319 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840359 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943375 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943405 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047189 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047205 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.149958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150103 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252909 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357612 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357657 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461047 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461139 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461153 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563905 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563941 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.666921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.666968 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.666997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.667035 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.667053 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.769997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770091 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874128 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874137 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976727 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976810 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085459 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137313 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137421 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137481 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137515 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137334 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137760 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137956 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.138024 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188052 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188096 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188151 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291067 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291108 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394414 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497284 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497377 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600266 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600292 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703941 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703986 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.770667 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.773416 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.773926 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.792105 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806690 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.807109 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.821648 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc 
kubenswrapper[4745]: I0319 00:09:05.842814 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.858631 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.880352 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.904781 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910150 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910172 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910222 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.920701 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.943752 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.967029 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.993366 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015155 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015641 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.034078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.070050 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] 
Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119127 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.141519 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.169011 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.186721 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.199013 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc 
kubenswrapper[4745]: I0319 00:09:06.212611 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221092 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.225045 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.236032 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.258669 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.270012 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.289480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.305808 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.321970 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324455 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327651 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327674 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.340087 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.341738 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345238 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345286 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.356992 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z 
is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.362649 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366622 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.373732 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.382736 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387083 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387095 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.389361 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.402528 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.403042 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406506 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406539 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406579 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.418539 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.418734 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.423377 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432776 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432786 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.452008 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.472391 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 
00:09:06.536403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536449 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640686 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640928 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744710 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744763 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855728 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960714 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960780 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064333 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064515 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064534 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.137754 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138007 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.138565 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138633 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.138687 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138742 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.138793 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138854 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.167910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.167983 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.168008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.168042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.168066 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272428 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272496 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376185 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376314 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479913 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583600 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687522 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687592 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791140 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791189 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791210 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894504 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894599 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894630 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894649 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997154 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997183 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997196 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099693 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.202002 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305371 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305496 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305570 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408720 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408833 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511533 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511590 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615360 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615421 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615475 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718797 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718947 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822714 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822910 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.926862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927647 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.031865 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.031963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.031987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.032014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.032035 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135493 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135575 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135601 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135635 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135658 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.136761 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.137129 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.137170 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.137546 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.137255 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.137196 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.138080 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.138091 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239660 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343003 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343146 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343173 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446878 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446928 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550954 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654744 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758512 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758535 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.863193 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.863729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.864076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.864316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.864538 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968169 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968195 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072512 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072614 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176481 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176570 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176601 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176620 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280142 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280226 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280255 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280274 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.382983 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383090 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383170 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383195 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486754 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486859 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486957 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590552 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590566 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590608 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694588 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694610 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797216 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797265 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901229 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.004731 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005127 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005231 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005329 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005443 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108083 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108784 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.137422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.137941 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.138141 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.138251 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138159 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138429 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138530 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138782 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212452 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.315519 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316460 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316550 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419711 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419748 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419765 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523502 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627837 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627948 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731058 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731130 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731146 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731190 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834594 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834700 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937980 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937995 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040673 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040695 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040709 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.138425 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143952 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.144007 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.248684 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249243 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249298 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352601 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352679 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456941 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456974 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579473 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682801 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785521 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785554 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.805711 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.808790 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.809367 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.825025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.838623 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.850225 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.864799 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.879087 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890197 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890276 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.895871 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.910902 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.925948 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.950268 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.970534 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.983732 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993278 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993319 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993353 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.000068 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.014515 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.027960 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.044982 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.059168 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.071432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc 
kubenswrapper[4745]: I0319 00:09:13.096191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096241 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096276 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137090 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137180 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137230 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137320 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137339 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137186 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137592 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137785 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198833 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198857 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302188 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302280 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302293 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406240 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406288 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509172 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509203 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611849 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611938 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611972 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713948 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713961 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.815007 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.816392 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818112 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818126 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818137 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.827566 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" exitCode=1 Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.827642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.827698 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.829121 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.829449 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.849843 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.863562 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.874036 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.891377 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7d
f3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.910480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921474 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921510 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.922324 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.931460 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc 
kubenswrapper[4745]: I0319 00:09:13.943171 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.952936 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.962914 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.973548 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.983006 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.994174 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.004647 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.015332 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024640 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024675 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.029152 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.041126 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127779 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230033 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230115 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333821 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333919 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333935 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437092 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437115 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539995 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642897 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642913 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746443 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.833173 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.837852 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:14 crc kubenswrapper[4745]: E0319 00:09:14.838062 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850232 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850264 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.853523 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.871121 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.890228 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.909391 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.927423 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.951670 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954739 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.970217 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.988213 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.014035 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.050303 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.057965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058076 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.072915 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.089688 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.108636 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.118417 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.124197 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.136957 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.137157 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.136959 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.137300 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.136958 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.137406 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.138252 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.138417 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.142095 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.159670 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptable
s-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162023 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162069 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162113 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.184165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc 
kubenswrapper[4745]: I0319 00:09:15.205189 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.219346 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.235218 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.251134 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265192 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265229 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.267684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c4
4871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.284981 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.303011 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.318093 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.340554 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.354140 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.367826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 
00:09:15.367953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.367984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.368023 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.368044 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.377553 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.390233 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.402279 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.423126 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.438446 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.453301 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.467992 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc 
kubenswrapper[4745]: I0319 00:09:15.471014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471068 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471130 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574513 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574543 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677456 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677577 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780868 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780991 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884654 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884699 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987395 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.088540 4745 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.157294 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c
3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.173837 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.195966 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.211761 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.224806 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.249025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.251616 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.274439 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.287539 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.300956 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.311388 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.323060 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.338099 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.351769 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.365801 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc 
kubenswrapper[4745]: I0319 00:09:16.380391 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.393936 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.407441 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.652930 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653107 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.669345 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674439 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674517 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674535 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674576 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.690931 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696119 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696205 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696290 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.716950 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722317 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722367 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.737735 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743254 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743369 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.761098 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.761276 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.955342 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.955561 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:09:48.955529704 +0000 UTC m=+153.493724835 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057291 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057358 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057459 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057488 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057516 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057553 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057745 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057773 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057792 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057839 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-19 00:09:49.057808393 +0000 UTC m=+153.596003564 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057871 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.057857564 +0000 UTC m=+153.596052735 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057916 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058013 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058032 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 
00:09:17.058044 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058013 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.057995499 +0000 UTC m=+153.596190660 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058093 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.058079642 +0000 UTC m=+153.596274793 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057637 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058139 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.058128343 +0000 UTC m=+153.596323494 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.137702 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.137745 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.137913 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.137943 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.138026 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.138138 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.138245 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.138345 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137089 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137169 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137091 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137328 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137112 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137242 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137533 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137580 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137097 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137102 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137059 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137284 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137482 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137552 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137618 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.253438 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137473 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137576 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137589 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.137637 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137823 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.137825 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.138025 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.138136 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.136784 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.137695 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.137057 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.138015 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.137059 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.138224 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.137087 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.138446 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.158466 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.177588 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.194758 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc 
kubenswrapper[4745]: I0319 00:09:26.213906 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.227853 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.244078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: E0319 00:09:26.254389 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.269818 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.283204 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.295278 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.310175 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.326231 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.343647 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.391349 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.415440 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.429622 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.445609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.462030 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131618 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131633 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137167 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137215 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137288 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137498 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137606 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137769 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137807 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137917 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.146088 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150210 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.161955 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165426 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165458 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.176263 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [duplicate status payload omitted; byte-identical to the preceding attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179233 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.191198 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [duplicate status payload omitted; byte-identical to the preceding attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.206593 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.206819 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137531 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137588 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137626 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.137750 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137861 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.138122 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.138164 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.138352 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:30 crc kubenswrapper[4745]: I0319 00:09:30.137752 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:30 crc kubenswrapper[4745]: E0319 00:09:30.138512 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.137424 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.137530 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.138182 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.138411 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.138365 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.138567 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.138694 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.139272 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.152060 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.256740 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905108 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/0.log" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905165 4745 generic.go:334] "Generic (PLEG): container finished" podID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" containerID="7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2" exitCode=1 Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905196 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerDied","Data":"7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2"} Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905724 4745 scope.go:117] "RemoveContainer" containerID="7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.921513 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.936625 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.947184 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc 
kubenswrapper[4745]: I0319 00:09:32.970099 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.981340 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.992559 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.006609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.018355 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.036376 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.048914 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.061925 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.076739 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.094094 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.104763 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.116667 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.126931 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136846 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136913 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136913 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137066 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136935 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137144 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136939 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137208 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137283 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.160681 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.913210 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/0.log" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.913309 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915"} Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.935591 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.949995 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.964056 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.978389 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.992864 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.004259 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.016651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.028644 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.041293 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.064320 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.084177 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.095611 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.105481 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc 
kubenswrapper[4745]: I0319 00:09:34.119018 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.131294 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.143118 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.156432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.168976 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137302 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137365 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137144 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137450 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137557 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.153063 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.171224 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.190672 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.206035 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.223791 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.238112 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: E0319 00:09:36.257137 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.264979 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.281025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.293820 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.307413 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.319222 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.343959 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.358186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.373054 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.386734 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc 
kubenswrapper[4745]: I0319 00:09:36.399896 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.412432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.424599 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137087 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137071 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137071 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137202 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137304 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137353 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137651 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534952 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534961 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534988 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.555794 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561590 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561608 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561633 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561652 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.584782 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590549 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590563 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.614048 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619304 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619318 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619341 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619355 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.641413 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646870 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646944 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.647010 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.665029 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.665166 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137159 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137292 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137386 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137187 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137187 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137448 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137595 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137914 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137296 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137369 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137414 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137601 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.137701 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.137841 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.137964 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.138154 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.259024 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.139368 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.948063 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.952242 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.952693 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.969461 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.983029 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.994927 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc 
kubenswrapper[4745]: I0319 00:09:43.008259 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.020863 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.032680 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.050100 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.069822 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.084931 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.100147 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.118343 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.132034 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.136728 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.136835 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.136748 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.136965 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.137064 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.137119 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.137222 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.137362 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.151918 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.183900 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.197583 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.213727 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.232909 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.247026 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.957101 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.958138 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.961353 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" exitCode=1 Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.961388 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.961424 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.962539 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.962806 
4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.978393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.995757 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.013667 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.033072 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.051159 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.066690 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.088615 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.106096 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.123287 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.135257 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.145834 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.155254 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.163938 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.180681 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node 
crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e
926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.202861 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.218133 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.227983 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.241796 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.967149 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.971017 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:09:44 crc kubenswrapper[4745]: E0319 00:09:44.971215 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.987675 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.000338 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.012111 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc 
kubenswrapper[4745]: I0319 00:09:45.024702 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.035383 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.045738 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.060431 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.072363 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.091379 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.106977 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.122165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.136312 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137346 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137403 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137357 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.137523 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137348 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.137821 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.138054 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.138128 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.176315 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.193298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.220098 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.230698 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.240450 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.258004 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.161098 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.183326 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.200462 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.220076 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.234965 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: E0319 00:09:46.259665 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.261076 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac
296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.284143 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.302084 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.323608 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.339174 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.350446 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.371214 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.394092 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.407375 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.419068 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.432742 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.445844 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.458015 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:47 crc 
kubenswrapper[4745]: I0319 00:09:47.137529 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.137582 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.137706 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.137713 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.137781 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.137905 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.138066 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.138155 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969916 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.970008 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:47Z","lastTransitionTime":"2026-03-19T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.985782 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990578 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990690 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:47Z","lastTransitionTime":"2026-03-19T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.014537 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020423 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020541 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:48Z","lastTransitionTime":"2026-03-19T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.041714 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046808 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046822 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046854 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:48Z","lastTransitionTime":"2026-03-19T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.088180 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093503 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093573 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:48Z","lastTransitionTime":"2026-03-19T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.109580 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.110100 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.023009 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.023293 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.023253587 +0000 UTC m=+217.561448718 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.124272 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.125973 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.124531 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.126109 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:49 crc 
kubenswrapper[4745]: E0319 00:09:49.126086 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126309 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126253185 +0000 UTC m=+217.664448446 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126365 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.126376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126402 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126514 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126487463 +0000 UTC m=+217.664682754 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.126598 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126677 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126701 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126717 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126771 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126783 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126765362 +0000 UTC m=+217.664960653 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126857 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126835344 +0000 UTC m=+217.665030495 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.127135 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.127433 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.127276889 +0000 UTC m=+217.665472150 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136808 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136952 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136924 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136852 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137284 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137404 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137570 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137775 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137337 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137443 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.137523 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137348 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.137630 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.137788 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137958 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.138053 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.260985 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137100 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137157 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137234 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137245 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137331 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137381 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137461 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137553 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137896 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137948 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137977 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137875 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138067 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138174 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138306 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.157928 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.172215 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.184137 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.193684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.203924 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.221424 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.236729 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.248234 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.260433 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc 
kubenswrapper[4745]: E0319 00:09:56.262155 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.273515 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.286984 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.299132 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.310298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.320304 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.334372 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.346906 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.358613 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.373384 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137606 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137707 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.137768 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137629 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137629 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.137864 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.137954 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.138012 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.154423 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167319 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167384 
4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.184098 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188427 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.202219 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206402 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206456 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.223938 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228708 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228728 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228740 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.244318 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.265705 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.265913 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.137685 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.137806 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.137833 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.138010 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.138075 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.138197 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.138229 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.138373 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.139283 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.139483 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137309 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.137835 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137404 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.137939 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137365 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137962 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.138010 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.138165 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.263595 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.137382 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.137533 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.137625 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.137556 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.137982 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.138147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.138343 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.138462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.137667 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.137717 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.138355 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.138819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.143573 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.143810 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.144282 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.145099 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.226691 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5xqfc" podStartSLOduration=132.226669753 podStartE2EDuration="2m12.226669753s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.226138745 +0000 UTC m=+170.764333886" watchObservedRunningTime="2026-03-19 00:10:06.226669753 +0000 UTC m=+170.764864884" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.239438 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podStartSLOduration=132.239417544 podStartE2EDuration="2m12.239417544s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.2393021 +0000 UTC m=+170.777497261" watchObservedRunningTime="2026-03-19 00:10:06.239417544 +0000 UTC m=+170.777612675" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.257497 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" 
podStartSLOduration=132.257472192 podStartE2EDuration="2m12.257472192s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.256476368 +0000 UTC m=+170.794671499" watchObservedRunningTime="2026-03-19 00:10:06.257472192 +0000 UTC m=+170.795667313" Mar 19 00:10:06 crc kubenswrapper[4745]: E0319 00:10:06.264114 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.277721 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mlwp7" podStartSLOduration=132.277696341 podStartE2EDuration="2m12.277696341s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.275168747 +0000 UTC m=+170.813363888" watchObservedRunningTime="2026-03-19 00:10:06.277696341 +0000 UTC m=+170.815891472" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.296020 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.295991666 podStartE2EDuration="35.295991666s" podCreationTimestamp="2026-03-19 00:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.295295173 +0000 UTC m=+170.833490304" watchObservedRunningTime="2026-03-19 00:10:06.295991666 +0000 UTC m=+170.834186797" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.305102 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.305086267 podStartE2EDuration="8.305086267s" podCreationTimestamp="2026-03-19 00:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.304945152 +0000 UTC m=+170.843140313" watchObservedRunningTime="2026-03-19 00:10:06.305086267 +0000 UTC m=+170.843281398" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.324395 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=76.324374295 podStartE2EDuration="1m16.324374295s" podCreationTimestamp="2026-03-19 00:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.322376959 +0000 UTC m=+170.860572090" watchObservedRunningTime="2026-03-19 00:10:06.324374295 +0000 UTC m=+170.862569426" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.338659 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=64.338644217 podStartE2EDuration="1m4.338644217s" podCreationTimestamp="2026-03-19 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.337330633 +0000 UTC m=+170.875525764" watchObservedRunningTime="2026-03-19 00:10:06.338644217 +0000 UTC m=+170.876839348" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.402552 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.402524621 podStartE2EDuration="1m14.402524621s" podCreationTimestamp="2026-03-19 00:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.400820025 +0000 UTC m=+170.939015186" watchObservedRunningTime="2026-03-19 00:10:06.402524621 +0000 UTC m=+170.940719762" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.451006 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" podStartSLOduration=132.450981684 podStartE2EDuration="2m12.450981684s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.450652663 +0000 UTC m=+170.988847804" watchObservedRunningTime="2026-03-19 00:10:06.450981684 +0000 UTC m=+170.989176815" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.464972 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xjkg8" podStartSLOduration=132.464948487 podStartE2EDuration="2m12.464948487s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.462848486 +0000 UTC m=+171.001043647" watchObservedRunningTime="2026-03-19 00:10:06.464948487 +0000 UTC m=+171.003143628" Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136849 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136936 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136991 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136855 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137075 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137267 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137377 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137455 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:10:08Z","lastTransitionTime":"2026-03-19T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.508819 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"] Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.509598 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.513217 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.513457 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.513916 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.515549 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656565 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd12a08-b896-4cbc-9688-981ca0494b82-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656614 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd12a08-b896-4cbc-9688-981ca0494b82-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656665 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656748 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656798 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd12a08-b896-4cbc-9688-981ca0494b82-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.757752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758283 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758405 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd12a08-b896-4cbc-9688-981ca0494b82-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.757991 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758358 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758483 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd12a08-b896-4cbc-9688-981ca0494b82-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758679 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd12a08-b896-4cbc-9688-981ca0494b82-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.761793 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd12a08-b896-4cbc-9688-981ca0494b82-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.766344 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd12a08-b896-4cbc-9688-981ca0494b82-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.783822 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd12a08-b896-4cbc-9688-981ca0494b82-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.828506 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.057844 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" event={"ID":"ecd12a08-b896-4cbc-9688-981ca0494b82","Type":"ContainerStarted","Data":"c54bb9f661d4b5732582033f80f563740b19734b6d3180d560f25bf5cdd8e4f1"} Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.058365 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" event={"ID":"ecd12a08-b896-4cbc-9688-981ca0494b82","Type":"ContainerStarted","Data":"1a18f51e597541ae70ada00e939b9962390a950a7cf2fd8cee1de8bf8f86d7af"} Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.076435 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" podStartSLOduration=135.07637774 podStartE2EDuration="2m15.07637774s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:09.074163087 +0000 UTC m=+173.612358218" watchObservedRunningTime="2026-03-19 00:10:09.07637774 +0000 UTC m=+173.614572901" Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137676 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137753 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137694 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.137870 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137689 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.137993 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.138042 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.138090 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.191022 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.204744 4745 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137193 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137682 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137718 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137742 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138136 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138122 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138204 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138289 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.266106 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:12 crc kubenswrapper[4745]: I0319 00:10:12.138056 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:10:12 crc kubenswrapper[4745]: E0319 00:10:12.138258 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137427 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137586 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137452 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137637 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137439 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137672 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137706 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137428 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137497 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137498 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137428 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137602 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137657 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137738 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137821 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:16 crc kubenswrapper[4745]: E0319 00:10:16.266658 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137249 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137283 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137311 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137402 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137515 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137611 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.093301 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.093940 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/0.log" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094027 4745 generic.go:334] "Generic (PLEG): container finished" podID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" exitCode=1 Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094074 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerDied","Data":"486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915"} Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094162 4745 scope.go:117] "RemoveContainer" containerID="7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094591 4745 scope.go:117] "RemoveContainer" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 
00:10:19.094814 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mlwp7_openshift-multus(6a0ae9c0-f19a-4038-be03-0fa6d223ebbf)\"" pod="openshift-multus/multus-mlwp7" podUID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.136768 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.136801 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.136935 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137019 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.137115 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137117 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137219 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137421 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:20 crc kubenswrapper[4745]: I0319 00:10:20.099724 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log" Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137167 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137197 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137241 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138426 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138169 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138560 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138695 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.268110 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137696 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137749 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137782 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.137837 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137713 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.137981 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.138047 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.138102 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.137109 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.137166 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.137263 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.137446 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.137553 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.137724 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.138013 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.138124 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:26 crc kubenswrapper[4745]: I0319 00:10:26.139006 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:10:26 crc kubenswrapper[4745]: E0319 00:10:26.268538 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.021638 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4r5k5"] Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.021829 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.022070 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.125145 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.127838 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.128244 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.137042 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.137073 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.137047 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.137175 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.137326 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.137357 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.157739 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podStartSLOduration=153.157722351 podStartE2EDuration="2m33.157722351s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:27.157666609 +0000 UTC m=+191.695861760" watchObservedRunningTime="2026-03-19 00:10:27.157722351 +0000 UTC m=+191.695917482" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137636 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137682 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137665 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137635 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.137855 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.137756 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.137962 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.138022 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:30 crc kubenswrapper[4745]: I0319 00:10:30.137585 4745 scope.go:117] "RemoveContainer" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137226 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137299 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.137717 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137371 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137361 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.137851 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.138029 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.138118 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.142403 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.142455 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8"} Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.270657 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137371 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137463 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137534 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137533 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137628 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137678 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137767 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137840 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137407 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137498 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137518 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137505 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.137678 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.137852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.137963 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.138036 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137038 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137095 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137124 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137309 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.142230 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143212 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143401 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143475 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143831 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.144278 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 00:10:38 crc kubenswrapper[4745]: I0319 00:10:38.981388 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.027999 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-h477l"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.028611 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.029017 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hg72d"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.029517 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.031736 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.032248 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxjjt"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.032270 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.033257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.033608 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.034569 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.034581 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.035678 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.035708 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.036855 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.036895 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.037491 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.037802 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ssbjs"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.038271 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.043711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.044403 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.044813 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.046689 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.047257 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ljtrr"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.047570 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.047955 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.049332 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.049566 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.049681 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.050467 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.050495 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.051426 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29564640-xrq9h"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.052092 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.054526 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.054841 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.055274 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.056303 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.056316 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.058970 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.059848 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.070990 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.071279 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.072219 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.073306 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.073585 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.081383 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.081982 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.082193 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.082395 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.083376 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.083536 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.084147 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.084334 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.085656 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.086859 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.087364 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.087452 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.088012 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.088166 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.089648 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108605 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfct\" (UniqueName: 
\"kubernetes.io/projected/d82019fe-0d36-4087-83db-41c03fa4fc66-kube-api-access-8tfct\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108641 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-dir\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108664 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108682 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/660e3fac-6534-49e0-a81e-38971c9fec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108705 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.108723 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108743 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108759 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxsw\" (UniqueName: \"kubernetes.io/projected/46576b1f-4646-44ba-a896-d509b05801cd-kube-api-access-vhxsw\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108776 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108794 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108819 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108858 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e036dd-6b1a-48ec-a9f4-a976673a6208-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108873 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8bxx\" (UniqueName: \"kubernetes.io/projected/660e3fac-6534-49e0-a81e-38971c9fec3f-kube-api-access-x8bxx\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108912 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-serving-cert\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " 
pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108927 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-client\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108945 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108962 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108976 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-audit\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108991 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-service-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109022 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-encryption-config\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109038 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-audit-dir\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109065 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxp4h\" (UniqueName: \"kubernetes.io/projected/bee68b29-e3e7-4a15-9bda-981764261dcc-kube-api-access-rxp4h\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-encryption-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 
00:10:39.109105 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109123 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-image-import-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109143 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-serving-cert\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109160 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109180 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-policies\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: 
\"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109219 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109256 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-config\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109273 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-config\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109290 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-images\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109304 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6fn\" (UniqueName: \"kubernetes.io/projected/83e036dd-6b1a-48ec-a9f4-a976673a6208-kube-api-access-tl6fn\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109323 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109339 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109354 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e036dd-6b1a-48ec-a9f4-a976673a6208-config\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109368 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82019fe-0d36-4087-83db-41c03fa4fc66-serving-cert\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109385 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-node-pullsecrets\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109400 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-etcd-client\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109575 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110007 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110087 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110121 4745 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110177 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110203 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110328 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110456 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110529 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110625 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110640 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.111190 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 
00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113034 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113059 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113202 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113282 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113415 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113608 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113669 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113602 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113936 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114074 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114325 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114457 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114492 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114588 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114670 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114756 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114807 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114927 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113203 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115130 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115196 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115327 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115356 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115487 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115549 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115584 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115671 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115700 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115742 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115807 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115496 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115811 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.117009 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.118455 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.118957 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.119367 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.122049 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fpxzh"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.122722 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.123228 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.123755 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.125923 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.126061 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.126158 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.126533 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.127165 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.131467 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jrq7v"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.132429 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.132542 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.132773 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8nmg"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.133623 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.135040 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.135493 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.138841 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.139053 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.139322 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.140181 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.140679 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.140824 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j2mf5"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141134 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141516 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141685 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.142122 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.142280 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.142406 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.163054 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.163869 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.164412 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.165143 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.167204 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.169663 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.174126 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.202663 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.204212 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.205153 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.206448 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.206761 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.207702 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.208002 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.208123 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: 
\"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210988 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211099 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210810 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210498 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210597 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211177 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-serving-cert\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211612 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-client\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211643 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211680 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-audit\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211702 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-service-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: 
\"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab89302b-10a8-43fa-ad93-699274acaac3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211774 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-default-certificate\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211807 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-audit-dir\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211834 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-encryption-config\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-serving-cert\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211903 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknf6\" (UniqueName: \"kubernetes.io/projected/116f15d2-ff67-4a98-846a-29bd6a129bbd-kube-api-access-fknf6\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211928 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-service-ca\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211954 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-auth-proxy-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211985 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.212012 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-service-ca-bundle\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212038 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212060 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212087 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-metrics-certs\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212115 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxp4h\" (UniqueName: \"kubernetes.io/projected/bee68b29-e3e7-4a15-9bda-981764261dcc-kube-api-access-rxp4h\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: 
\"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212136 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-client\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212181 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rssq\" (UniqueName: \"kubernetes.io/projected/053b13b0-078a-45ea-a005-e38aab17b42f-kube-api-access-7rssq\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212204 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsb2s\" (UniqueName: \"kubernetes.io/projected/19580e75-5123-4261-ac5c-96dbd7834613-kube-api-access-tsb2s\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212239 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-encryption-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212270 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212292 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-image-import-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212320 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212345 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-serving-cert\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212366 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-policies\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab89302b-10a8-43fa-ad93-699274acaac3-config\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-trusted-ca-bundle\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212423 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-config\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212441 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-machine-approver-tls\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.212457 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-console-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212482 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212501 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212544 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19580e75-5123-4261-ac5c-96dbd7834613-serving-cert\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212565 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212584 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212603 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212622 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-config\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212659 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212678 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-images\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212696 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-config\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212713 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199f2552-58de-4ea8-adf5-f1aee925f49b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212731 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199f2552-58de-4ea8-adf5-f1aee925f49b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212748 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdpdj\" (UniqueName: \"kubernetes.io/projected/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-kube-api-access-tdpdj\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212765 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212787 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-config\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212806 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tl6fn\" (UniqueName: \"kubernetes.io/projected/83e036dd-6b1a-48ec-a9f4-a976673a6208-kube-api-access-tl6fn\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212842 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnqh\" (UniqueName: \"kubernetes.io/projected/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-kube-api-access-9jnqh\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212859 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-kube-api-access-cw6nw\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212898 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212936 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e036dd-6b1a-48ec-a9f4-a976673a6208-config\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212953 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82019fe-0d36-4087-83db-41c03fa4fc66-serving-cert\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212968 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-service-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212984 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213000 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-stats-auth\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213018 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-etcd-client\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213038 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-node-pullsecrets\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfct\" (UniqueName: \"kubernetes.io/projected/d82019fe-0d36-4087-83db-41c03fa4fc66-kube-api-access-8tfct\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 
00:10:39.213086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-dir\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213107 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/660e3fac-6534-49e0-a81e-38971c9fec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213125 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-oauth-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213140 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5npf\" (UniqueName: \"kubernetes.io/projected/fd804c03-3021-44bd-8ce8-a10a482c59b4-kube-api-access-x5npf\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213163 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " 
pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213871 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hg72d"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.214008 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.214947 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.214977 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-oauth-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215003 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd804c03-3021-44bd-8ce8-a10a482c59b4-serving-cert\") pod 
\"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215021 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215040 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215063 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxsw\" (UniqueName: \"kubernetes.io/projected/46576b1f-4646-44ba-a896-d509b05801cd-kube-api-access-vhxsw\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215096 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215114 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215133 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzlgl\" (UniqueName: \"kubernetes.io/projected/199f2552-58de-4ea8-adf5-f1aee925f49b-kube-api-access-fzlgl\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab89302b-10a8-43fa-ad93-699274acaac3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215173 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215192 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215210 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/19580e75-5123-4261-ac5c-96dbd7834613-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215232 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2fgf\" (UniqueName: \"kubernetes.io/projected/38dd3b53-64de-4201-b427-0b1bc3e51849-kube-api-access-z2fgf\") pod \"downloads-7954f5f757-ljtrr\" (UID: \"38dd3b53-64de-4201-b427-0b1bc3e51849\") " 
pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215265 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-trusted-ca\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215294 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8bxx\" (UniqueName: \"kubernetes.io/projected/660e3fac-6534-49e0-a81e-38971c9fec3f-kube-api-access-x8bxx\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215312 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215328 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215353 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/83e036dd-6b1a-48ec-a9f4-a976673a6208-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216239 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211777 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216813 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-audit\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-service-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216960 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-audit-dir\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.217798 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.217955 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-image-import-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.218565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.219342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e036dd-6b1a-48ec-a9f4-a976673a6208-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.221992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-policies\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.222357 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.222375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-config\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.222965 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223031 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-dir\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223196 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-encryption-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223283 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223333 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-node-pullsecrets\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-config\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223636 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-client\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223776 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.224497 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.224654 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.224813 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.225057 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-encryption-config\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.225468 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.226625 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-serving-cert\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.226668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-serving-cert\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.227835 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.228586 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/660e3fac-6534-49e0-a81e-38971c9fec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.230699 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82019fe-0d36-4087-83db-41c03fa4fc66-serving-cert\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.231536 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-etcd-client\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.232668 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.235077 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.235623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.237295 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.239162 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-images\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.239233 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.240186 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t99wg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.240291 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.241105 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.243570 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244184 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244311 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244752 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e036dd-6b1a-48ec-a9f4-a976673a6208-config\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244916 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.245106 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.245869 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.250358 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.251198 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.251711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.252113 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.252151 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.252455 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.253488 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.254569 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-h477l"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.254605 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.254895 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.255739 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.256158 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.257107 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.258410 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259165 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259165 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259341 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259984 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.260855 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ssbjs"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.262143 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.263498 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ljtrr"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.268555 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29564640-xrq9h"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.269714 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.271227 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.272379 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.275538 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8m25n"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.281561 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t28kd"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.282168 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8m25n"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.284815 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t28kd"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.285650 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.288592 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.291112 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.291250 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.292718 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.293777 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.295062 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.296228 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.297624 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.299988 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxjjt"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.301168 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fpxzh"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.302764 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.304029 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.304849 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.305988 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.307192 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t99wg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.308526 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.310415 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.311200 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.311428 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-54kzj"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.312864 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.313006 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.314110 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8nmg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.315218 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.315976 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316006 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-service-ca-bundle\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316029 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-auth-proxy-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316051 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316090 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-metrics-certs\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316105 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-client\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316124 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316145 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rssq\" (UniqueName: \"kubernetes.io/projected/053b13b0-078a-45ea-a005-e38aab17b42f-kube-api-access-7rssq\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316163 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsb2s\" (UniqueName: \"kubernetes.io/projected/19580e75-5123-4261-ac5c-96dbd7834613-kube-api-access-tsb2s\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316231 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab89302b-10a8-43fa-ad93-699274acaac3-config\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316252 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-trusted-ca-bundle\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-config\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316301 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-machine-approver-tls\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316316 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-console-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316336 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316354 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316392 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19580e75-5123-4261-ac5c-96dbd7834613-serving-cert\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316411 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316428 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316466 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199f2552-58de-4ea8-adf5-f1aee925f49b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316484 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199f2552-58de-4ea8-adf5-f1aee925f49b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdpdj\" (UniqueName: \"kubernetes.io/projected/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-kube-api-access-tdpdj\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316518 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316537 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-config\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316563 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnqh\" (UniqueName: \"kubernetes.io/projected/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-kube-api-access-9jnqh\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316579 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-kube-api-access-cw6nw\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316597 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316617 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-service-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316635 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316656 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-stats-auth\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316683 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-oauth-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316703 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5npf\" (UniqueName: \"kubernetes.io/projected/fd804c03-3021-44bd-8ce8-a10a482c59b4-kube-api-access-x5npf\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316724 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-oauth-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd804c03-3021-44bd-8ce8-a10a482c59b4-serving-cert\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316767 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316784 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzlgl\" (UniqueName: \"kubernetes.io/projected/199f2552-58de-4ea8-adf5-f1aee925f49b-kube-api-access-fzlgl\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316829 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab89302b-10a8-43fa-ad93-699274acaac3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316846 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316862 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316898 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/19580e75-5123-4261-ac5c-96dbd7834613-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2fgf\" (UniqueName: \"kubernetes.io/projected/38dd3b53-64de-4201-b427-0b1bc3e51849-kube-api-access-z2fgf\") pod \"downloads-7954f5f757-ljtrr\" (UID: \"38dd3b53-64de-4201-b427-0b1bc3e51849\") " pod="openshift-console/downloads-7954f5f757-ljtrr"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316936 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-trusted-ca\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316963 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316991 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317025 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317045 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab89302b-10a8-43fa-ad93-699274acaac3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317103 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-default-certificate\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-serving-cert\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317150 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-auth-proxy-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317152 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknf6\" (UniqueName: \"kubernetes.io/projected/116f15d2-ff67-4a98-846a-29bd6a129bbd-kube-api-access-fknf6\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317330 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-service-ca\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317824 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m56ls"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318135 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-service-ca\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318818 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318942 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab89302b-10a8-43fa-ad93-699274acaac3-config\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318961 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317846 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.319482 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.319980 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320388 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-trusted-ca-bundle\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320589 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: 
\"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320921 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320957 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/19580e75-5123-4261-ac5c-96dbd7834613-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.321950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322209 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-trusted-ca\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322475 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.322536 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322552 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.324346 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-oauth-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.324726 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325099 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-console-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325138 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325165 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199f2552-58de-4ea8-adf5-f1aee925f49b-config\") 
pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325552 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-oauth-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-config\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.326726 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199f2552-58de-4ea8-adf5-f1aee925f49b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327200 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j2mf5"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327234 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t28kd"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327381 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327524 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-machine-approver-tls\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.328347 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.329385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.329461 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 
00:10:39.330662 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.331342 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.331983 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19580e75-5123-4261-ac5c-96dbd7834613-serving-cert\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.332214 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-54kzj"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330232 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.333291 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m56ls"] Mar 19 00:10:39 crc kubenswrapper[4745]: 
I0319 00:10:39.333727 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-stats-auth\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.334459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.335893 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd804c03-3021-44bd-8ce8-a10a482c59b4-serving-cert\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.336106 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab89302b-10a8-43fa-ad93-699274acaac3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.336847 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.337135 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.338479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-default-certificate\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.341548 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-metrics-certs\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.345274 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.351214 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.357589 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-service-ca-bundle\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.371272 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.390634 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.410604 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.411382 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-config\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.431358 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.436754 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-serving-cert\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.451576 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.459979 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-client\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.471610 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.480095 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-service-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.491479 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.512070 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.513861 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.531988 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.562257 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.571873 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.591571 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.636092 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.651397 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.671518 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.691872 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.712046 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.731907 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.751509 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.773246 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.791357 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.811321 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.831936 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.851913 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.892423 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.898510 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.911124 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.932439 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.958824 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 
00:10:39.970830 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.009833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.012051 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.017364 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.031477 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.033575 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.051914 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.071512 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.110393 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.112450 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.153064 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.159639 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6fn\" (UniqueName: \"kubernetes.io/projected/83e036dd-6b1a-48ec-a9f4-a976673a6208-kube-api-access-tl6fn\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.171869 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.191246 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.223944 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.229094 4745 request.go:700] Waited for 1.006551228s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.230746 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxsw\" (UniqueName: \"kubernetes.io/projected/46576b1f-4646-44ba-a896-d509b05801cd-kube-api-access-vhxsw\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.249493 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfct\" (UniqueName: \"kubernetes.io/projected/d82019fe-0d36-4087-83db-41c03fa4fc66-kube-api-access-8tfct\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.250986 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.259235 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dafcd2_537e_46fe_8028_41bc6ff146a0.slice/crio-4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba WatchSource:0}: Error finding container 4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba: Status 404 returned error can't find the container with id 4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.265176 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.273142 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxp4h\" (UniqueName: \"kubernetes.io/projected/bee68b29-e3e7-4a15-9bda-981764261dcc-kube-api-access-rxp4h\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.289093 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8bxx\" (UniqueName: \"kubernetes.io/projected/660e3fac-6534-49e0-a81e-38971c9fec3f-kube-api-access-x8bxx\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.293057 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.311311 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.324145 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.330824 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.351505 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.371477 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.390765 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.404479 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.412402 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.431199 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.434265 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.451430 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.456294 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-h477l"] Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.471852 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.480045 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82019fe_0d36_4087_83db_41c03fa4fc66.slice/crio-206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59 WatchSource:0}: Error finding container 206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59: Status 404 returned error can't find the container with id 206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59 Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.491063 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.512965 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.530355 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxjjt"] Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.548799 4745 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46576b1f_4646_44ba_a896_d509b05801cd.slice/crio-024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9 WatchSource:0}: Error finding container 024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9: Status 404 returned error can't find the container with id 024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9 Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.551092 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.567249 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.572503 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.590839 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.611022 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.620345 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"] Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.629537 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e036dd_6b1a_48ec_a9f4_a976673a6208.slice/crio-592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea WatchSource:0}: Error finding container 
592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea: Status 404 returned error can't find the container with id 592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.631946 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.652869 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.653268 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"] Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.676557 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.692311 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.712216 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.767776 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.768347 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.770394 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 
00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.792265 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.797416 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hg72d"] Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.809585 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660e3fac_6534_49e0_a81e_38971c9fec3f.slice/crio-763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df WatchSource:0}: Error finding container 763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df: Status 404 returned error can't find the container with id 763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.810421 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.830990 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.851504 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.870797 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.891628 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.911597 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.932163 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.951977 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.970339 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.993951 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.013489 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.032127 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.051325 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.071231 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.091399 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.111755 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.131083 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.150626 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.171219 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.192418 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.211154 4745 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.212624 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerStarted","Data":"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.212710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerStarted","Data":"4cb2967d4cad71f05028927d26e36d1cb64a836920dd56873c07f0bfdd7a7214"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.212944 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.215213 
4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerStarted","Data":"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.215277 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerStarted","Data":"4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.215381 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.217267 4745 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v4wtx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.217349 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.219851 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" event={"ID":"83e036dd-6b1a-48ec-a9f4-a976673a6208","Type":"ContainerStarted","Data":"1247c1f125243c467fd6c5c4871847fc5bd9c2c4772f9df2c85f3882410c62db"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.219919 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" event={"ID":"83e036dd-6b1a-48ec-a9f4-a976673a6208","Type":"ContainerStarted","Data":"592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.229033 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" event={"ID":"d82019fe-0d36-4087-83db-41c03fa4fc66","Type":"ContainerStarted","Data":"f333d55613085cf065f0326ce3515bb88d1b269111e254b53a6bdee309dc10d8"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.229100 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" event={"ID":"d82019fe-0d36-4087-83db-41c03fa4fc66","Type":"ContainerStarted","Data":"206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.230547 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.233298 4745 generic.go:334] "Generic (PLEG): container finished" podID="46576b1f-4646-44ba-a896-d509b05801cd" containerID="2b6247b74f321e3c9fe6e40c745a20e16808f6be322f361b906cd1acad47a45c" exitCode=0 Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.233467 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerDied","Data":"2b6247b74f321e3c9fe6e40c745a20e16808f6be322f361b906cd1acad47a45c"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.233526 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" 
event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerStarted","Data":"024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.237135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" event={"ID":"660e3fac-6534-49e0-a81e-38971c9fec3f","Type":"ContainerStarted","Data":"def9e2832bd01bc34e82a756dd4b7c55d6daabcbcf53dab3ac06e388b10d54f0"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.237190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" event={"ID":"660e3fac-6534-49e0-a81e-38971c9fec3f","Type":"ContainerStarted","Data":"c1e5ff3bf3ede6287e2aa318febf0edcffa1143f843a940ba2865e8265d8d20f"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.237204 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" event={"ID":"660e3fac-6534-49e0-a81e-38971c9fec3f","Type":"ContainerStarted","Data":"763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.239733 4745 generic.go:334] "Generic (PLEG): container finished" podID="bee68b29-e3e7-4a15-9bda-981764261dcc" containerID="e7c145419018b8d03972266370c33562f5717fa3cfbdb424202a6f3c7a02a817" exitCode=0 Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.239867 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" event={"ID":"bee68b29-e3e7-4a15-9bda-981764261dcc","Type":"ContainerDied","Data":"e7c145419018b8d03972266370c33562f5717fa3cfbdb424202a6f3c7a02a817"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.240085 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" 
event={"ID":"bee68b29-e3e7-4a15-9bda-981764261dcc","Type":"ContainerStarted","Data":"77b4922577ab245ca2c7b782a0c40bdddaf3482c9e9f3098984860ebf3a3ba17"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.250006 4745 request.go:700] Waited for 1.932693634s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/serviceaccounts/etcd-operator/token Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.282232 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknf6\" (UniqueName: \"kubernetes.io/projected/116f15d2-ff67-4a98-846a-29bd6a129bbd-kube-api-access-fknf6\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.289859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rssq\" (UniqueName: \"kubernetes.io/projected/053b13b0-078a-45ea-a005-e38aab17b42f-kube-api-access-7rssq\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.318711 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsb2s\" (UniqueName: \"kubernetes.io/projected/19580e75-5123-4261-ac5c-96dbd7834613-kube-api-access-tsb2s\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.328507 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod \"image-pruner-29564640-xrq9h\" (UID: 
\"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.357346 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.366858 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.391160 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.398191 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab89302b-10a8-43fa-ad93-699274acaac3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.398315 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzlgl\" (UniqueName: \"kubernetes.io/projected/199f2552-58de-4ea8-adf5-f1aee925f49b-kube-api-access-fzlgl\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.405232 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.412082 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.415092 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.431854 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.451269 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.465513 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.468200 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.491154 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.492457 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2fgf\" (UniqueName: \"kubernetes.io/projected/38dd3b53-64de-4201-b427-0b1bc3e51849-kube-api-access-z2fgf\") pod \"downloads-7954f5f757-ljtrr\" (UID: \"38dd3b53-64de-4201-b427-0b1bc3e51849\") " pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.524039 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5npf\" (UniqueName: \"kubernetes.io/projected/fd804c03-3021-44bd-8ce8-a10a482c59b4-kube-api-access-x5npf\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.574266 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.601734 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-kube-api-access-cw6nw\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.603439 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdpdj\" (UniqueName: \"kubernetes.io/projected/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-kube-api-access-tdpdj\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.612120 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnqh\" (UniqueName: \"kubernetes.io/projected/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-kube-api-access-9jnqh\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.653830 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.683917 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685112 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrgfx\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-kube-api-access-qrgfx\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685159 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685227 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685262 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685278 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-metrics-tls\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685295 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bks\" (UniqueName: \"kubernetes.io/projected/4394045d-753a-4e2b-8ea5-7087d41481d2-kube-api-access-f2bks\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685330 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a51c238-7c2c-470f-8123-472327367ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685347 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-control-plane-machine-set-operator-tls\") 
pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685453 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlw2\" (UniqueName: \"kubernetes.io/projected/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-kube-api-access-4vlw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685537 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-serving-cert\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686074 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686099 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-config\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: 
I0319 00:10:41.686172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686195 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686217 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzmc\" (UniqueName: \"kubernetes.io/projected/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-kube-api-access-9qzmc\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686277 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686305 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686394 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a51c238-7c2c-470f-8123-472327367ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mp62\" (UniqueName: \"kubernetes.io/projected/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-kube-api-access-6mp62\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687264 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687292 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwkxl\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-kube-api-access-vwkxl\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 
19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687331 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-webhook-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687349 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687380 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4394045d-753a-4e2b-8ea5-7087d41481d2-tmpfs\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687397 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687413 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688670 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfhj\" (UniqueName: \"kubernetes.io/projected/5e57c8e3-d9a0-42bc-98d3-336656039e9c-kube-api-access-ggfhj\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688777 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688828 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c46c928-9116-4548-b20e-d9a66d439012-proxy-tls\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688848 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.689198 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.689688 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.189674858 +0000 UTC m=+206.727869989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690015 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c46c928-9116-4548-b20e-d9a66d439012-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690047 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690092 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690132 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jx67\" (UniqueName: \"kubernetes.io/projected/4c46c928-9116-4548-b20e-d9a66d439012-kube-api-access-8jx67\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690214 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690271 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-cabundle\") pod 
\"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.691517 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-key\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.694308 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.762446 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.776339 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.784882 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.793465 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.793719 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.293680893 +0000 UTC m=+206.831876024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794218 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-proxy-tls\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794304 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlw2\" (UniqueName: 
\"kubernetes.io/projected/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-kube-api-access-4vlw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794663 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794703 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794748 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-serving-cert\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794782 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-registration-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794834 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-certs\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794869 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead6910f-478b-45b5-b83c-06d3733635cb-metrics-tls\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794936 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794966 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-config\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794990 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mm5v\" (UniqueName: \"kubernetes.io/projected/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-kube-api-access-5mm5v\") pod 
\"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795027 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt2wh\" (UniqueName: \"kubernetes.io/projected/4e2ca532-9e45-44d5-b541-3e9b34352d75-kube-api-access-jt2wh\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795098 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795126 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795185 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jw6qq\" (UniqueName: \"kubernetes.io/projected/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-kube-api-access-jw6qq\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzmc\" (UniqueName: \"kubernetes.io/projected/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-kube-api-access-9qzmc\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795287 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-csi-data-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795371 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a51c238-7c2c-470f-8123-472327367ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795402 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mp62\" (UniqueName: \"kubernetes.io/projected/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-kube-api-access-6mp62\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795445 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795471 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwkxl\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-kube-api-access-vwkxl\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796837 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-config\") pod 
\"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796900 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-webhook-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796957 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead6910f-478b-45b5-b83c-06d3733635cb-config-volume\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796996 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4394045d-753a-4e2b-8ea5-7087d41481d2-tmpfs\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797025 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-socket-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797113 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797144 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj8wf\" (UniqueName: \"kubernetes.io/projected/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-kube-api-access-wj8wf\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfhj\" (UniqueName: \"kubernetes.io/projected/5e57c8e3-d9a0-42bc-98d3-336656039e9c-kube-api-access-ggfhj\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797215 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-node-bootstrap-token\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797267 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-cert\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797304 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"auto-csr-approver-29564650-7k6ld\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797362 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98k68\" (UniqueName: \"kubernetes.io/projected/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-kube-api-access-98k68\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: 
\"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c46c928-9116-4548-b20e-d9a66d439012-proxy-tls\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797421 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797449 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797499 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797533 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcq9n\" (UniqueName: \"kubernetes.io/projected/4bf6d6d4-7566-4d1a-acc9-161b6b16f93d-kube-api-access-lcq9n\") pod \"migrator-59844c95c7-bpk8d\" (UID: \"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797566 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797595 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-srv-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797182 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-config\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.801479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-serving-cert\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.804434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.805147 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.805240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/11217236-1702-4cd3-b097-e3e410cbbdb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.816013 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29564640-xrq9h"] Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.816117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlw2\" (UniqueName: \"kubernetes.io/projected/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-kube-api-access-4vlw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 
00:10:41.818741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819028 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pvv\" (UniqueName: \"kubernetes.io/projected/11217236-1702-4cd3-b097-e3e410cbbdb4-kube-api-access-d5pvv\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819081 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819106 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c46c928-9116-4548-b20e-d9a66d439012-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819160 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-images\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819270 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b00ac21-bd51-4aff-a8fc-d14d2b930940-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819293 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jx67\" (UniqueName: \"kubernetes.io/projected/4c46c928-9116-4548-b20e-d9a66d439012-kube-api-access-8jx67\") pod 
\"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819312 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txg4\" (UniqueName: \"kubernetes.io/projected/b7b46d81-2c53-4021-be8a-f962c576a94c-kube-api-access-2txg4\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819350 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819748 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.820379 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a51c238-7c2c-470f-8123-472327367ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.820813 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.320795555 +0000 UTC m=+206.858990686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.820967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-cabundle\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.821969 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4394045d-753a-4e2b-8ea5-7087d41481d2-tmpfs\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.822071 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c46c928-9116-4548-b20e-d9a66d439012-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 
00:10:41.822520 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptmgs\" (UniqueName: \"kubernetes.io/projected/ead6910f-478b-45b5-b83c-06d3733635cb-kube-api-access-ptmgs\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.822566 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b00ac21-bd51-4aff-a8fc-d14d2b930940-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.824330 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.824997 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-cabundle\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825169 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-key\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825222 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7m7w\" (UniqueName: \"kubernetes.io/projected/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-kube-api-access-g7m7w\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825644 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcpdx\" (UniqueName: \"kubernetes.io/projected/9b00ac21-bd51-4aff-a8fc-d14d2b930940-kube-api-access-fcpdx\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825803 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-mountpoint-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826525 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826602 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrgfx\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-kube-api-access-qrgfx\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826649 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-profile-collector-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826876 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827027 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827281 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827328 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827351 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827392 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-metrics-tls\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829269 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-plugins-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829305 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bks\" (UniqueName: \"kubernetes.io/projected/4394045d-753a-4e2b-8ea5-7087d41481d2-kube-api-access-f2bks\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829522 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a51c238-7c2c-470f-8123-472327367ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829622 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829666 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-srv-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.835163 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.836004 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837143 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837354 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837483 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a51c238-7c2c-470f-8123-472327367ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.838287 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-key\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.836702 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.838970 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-metrics-tls\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.842753 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.846670 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-webhook-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.849903 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c46c928-9116-4548-b20e-d9a66d439012-proxy-tls\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.877384 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzmc\" (UniqueName: \"kubernetes.io/projected/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-kube-api-access-9qzmc\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.904949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.930728 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.930943 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txg4\" (UniqueName: \"kubernetes.io/projected/b7b46d81-2c53-4021-be8a-f962c576a94c-kube-api-access-2txg4\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b00ac21-bd51-4aff-a8fc-d14d2b930940-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptmgs\" (UniqueName: \"kubernetes.io/projected/ead6910f-478b-45b5-b83c-06d3733635cb-kube-api-access-ptmgs\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931062 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7m7w\" (UniqueName: \"kubernetes.io/projected/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-kube-api-access-g7m7w\") pod 
\"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-mountpoint-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931100 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcpdx\" (UniqueName: \"kubernetes.io/projected/9b00ac21-bd51-4aff-a8fc-d14d2b930940-kube-api-access-fcpdx\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931124 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-profile-collector-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931139 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 
00:10:41.931161 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-plugins-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-srv-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931231 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-proxy-tls\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931248 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931272 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931289 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-registration-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931308 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-certs\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931325 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931341 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead6910f-478b-45b5-b83c-06d3733635cb-metrics-tls\") pod 
\"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931359 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mm5v\" (UniqueName: \"kubernetes.io/projected/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-kube-api-access-5mm5v\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931379 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt2wh\" (UniqueName: \"kubernetes.io/projected/4e2ca532-9e45-44d5-b541-3e9b34352d75-kube-api-access-jt2wh\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931398 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6qq\" (UniqueName: \"kubernetes.io/projected/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-kube-api-access-jw6qq\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931430 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-csi-data-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931461 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-config\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931478 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead6910f-478b-45b5-b83c-06d3733635cb-config-volume\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931493 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-socket-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931511 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj8wf\" (UniqueName: \"kubernetes.io/projected/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-kube-api-access-wj8wf\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931548 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-node-bootstrap-token\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931578 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-cert\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931604 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"auto-csr-approver-29564650-7k6ld\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931639 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98k68\" (UniqueName: \"kubernetes.io/projected/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-kube-api-access-98k68\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc 
kubenswrapper[4745]: I0319 00:10:41.931677 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcq9n\" (UniqueName: \"kubernetes.io/projected/4bf6d6d4-7566-4d1a-acc9-161b6b16f93d-kube-api-access-lcq9n\") pod \"migrator-59844c95c7-bpk8d\" (UID: \"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931700 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-srv-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931716 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/11217236-1702-4cd3-b097-e3e410cbbdb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931886 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pvv\" (UniqueName: \"kubernetes.io/projected/11217236-1702-4cd3-b097-e3e410cbbdb4-kube-api-access-d5pvv\") pod 
\"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931946 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-images\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931962 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b00ac21-bd51-4aff-a8fc-d14d2b930940-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.932845 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-csi-data-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.933557 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-config\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.933884 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-images\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.934497 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-socket-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.934812 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.434791091 +0000 UTC m=+206.972986232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.935315 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-mountpoint-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.938821 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-plugins-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.939235 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead6910f-478b-45b5-b83c-06d3733635cb-config-volume\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.939490 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b00ac21-bd51-4aff-a8fc-d14d2b930940-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.939960 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfhj\" (UniqueName: \"kubernetes.io/projected/5e57c8e3-d9a0-42bc-98d3-336656039e9c-kube-api-access-ggfhj\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.949906 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b00ac21-bd51-4aff-a8fc-d14d2b930940-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.949992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-registration-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.950493 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.956623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.968517 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-srv-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.969765 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ssbjs"] Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.970395 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973038 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-profile-collector-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973075 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973441 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"] Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973858 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.974276 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/11217236-1702-4cd3-b097-e3e410cbbdb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.976850 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-srv-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.977289 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead6910f-478b-45b5-b83c-06d3733635cb-metrics-tls\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 
00:10:41.977991 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-proxy-tls\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.978949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.981938 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-cert\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.982305 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-certs\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.982504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc 
kubenswrapper[4745]: I0319 00:10:41.982531 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.983436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-node-bootstrap-token\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.992281 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.012515 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jx67\" (UniqueName: \"kubernetes.io/projected/4c46c928-9116-4548-b20e-d9a66d439012-kube-api-access-8jx67\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.018229 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"image-registry-697d97f7c8-ggt62\" (UID: 
\"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.032660 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.033037 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.533023865 +0000 UTC m=+207.071218996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.041481 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mp62\" (UniqueName: \"kubernetes.io/projected/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-kube-api-access-6mp62\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.056129 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwkxl\" (UniqueName: 
\"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-kube-api-access-vwkxl\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.090069 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrgfx\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-kube-api-access-qrgfx\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.098534 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.104495 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.110314 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.118796 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bks\" (UniqueName: \"kubernetes.io/projected/4394045d-753a-4e2b-8ea5-7087d41481d2-kube-api-access-f2bks\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.119109 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.128113 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.133281 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.133755 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.63373558 +0000 UTC m=+207.171930711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.138506 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.147014 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.156593 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.163486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"auto-csr-approver-29564650-7k6ld\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.181879 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.182534 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.188912 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt2wh\" (UniqueName: \"kubernetes.io/projected/4e2ca532-9e45-44d5-b541-3e9b34352d75-kube-api-access-jt2wh\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.193642 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.202037 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6qq\" (UniqueName: \"kubernetes.io/projected/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-kube-api-access-jw6qq\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.210449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pvv\" (UniqueName: \"kubernetes.io/projected/11217236-1702-4cd3-b097-e3e410cbbdb4-kube-api-access-d5pvv\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.212761 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.227377 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.234567 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.235028 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.735008115 +0000 UTC m=+207.273203246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.235805 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj8wf\" (UniqueName: \"kubernetes.io/projected/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-kube-api-access-wj8wf\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.272625 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.274211 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txg4\" (UniqueName: \"kubernetes.io/projected/b7b46d81-2c53-4021-be8a-f962c576a94c-kube-api-access-2txg4\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.274775 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ssbjs" event={"ID":"053b13b0-078a-45ea-a005-e38aab17b42f","Type":"ContainerStarted","Data":"4599ea118ed41967dec054d250458ddd9d4446dbf5e68fcd9b9416f57bb90d83"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.284657 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.296458 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptmgs\" (UniqueName: \"kubernetes.io/projected/ead6910f-478b-45b5-b83c-06d3733635cb-kube-api-access-ptmgs\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.297695 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.299148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jrq7v" event={"ID":"5c3bf4d2-ad08-42d0-bd92-f94074fc4833","Type":"ContainerStarted","Data":"0fce18da0d381d684e8f3380652feea36745ca4b26730b86dc8d697b920e6fe4"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.299275 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7m7w\" (UniqueName: \"kubernetes.io/projected/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-kube-api-access-g7m7w\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.306587 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.313622 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" event={"ID":"ab89302b-10a8-43fa-ad93-699274acaac3","Type":"ContainerStarted","Data":"d338496d01871358c0d340b96464ff899bf3fd9be588fe946dcb971739923471"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.337750 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8nmg"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.337824 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.338259 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.338553 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.838527524 +0000 UTC m=+207.376722665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.346348 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.346861 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.84684622 +0000 UTC m=+207.385041351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.348170 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" event={"ID":"19580e75-5123-4261-ac5c-96dbd7834613","Type":"ContainerStarted","Data":"d6f1707d3a61337a62dd4d3650b8c4ac2606e8c3177922a6463e575e12c28609"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.359274 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcpdx\" (UniqueName: \"kubernetes.io/projected/9b00ac21-bd51-4aff-a8fc-d14d2b930940-kube-api-access-fcpdx\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.361425 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mm5v\" (UniqueName: \"kubernetes.io/projected/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-kube-api-access-5mm5v\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.364609 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98k68\" 
(UniqueName: \"kubernetes.io/projected/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-kube-api-access-98k68\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.376282 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerStarted","Data":"87f903fbf801f35514644058e6b9df1eec3cf9b7864d28e34bc2e418327afbf2"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.386089 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" event={"ID":"a43287ca-c4a1-424a-86b5-f7f4f1a627a9","Type":"ContainerStarted","Data":"84a1f23b1aa1951875f5b0e20abfba39aa7ee93cb5d48a1278af51ebd780e196"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.386414 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ljtrr"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.386689 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.402172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerStarted","Data":"f164174bebded768a6d9e6df2a3ae9216824193f2bf4ab662809fd13bd0bfb0d"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.402826 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.429573 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.434789 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" event={"ID":"bee68b29-e3e7-4a15-9bda-981764261dcc","Type":"ContainerStarted","Data":"9cab78c65b3c450b670b6aa33c7bdd5ac404af8d1d3efaf0f8f02962c1a60760"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.443682 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.447566 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.448820 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.948797576 +0000 UTC m=+207.486992707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.448856 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcq9n\" (UniqueName: \"kubernetes.io/projected/4bf6d6d4-7566-4d1a-acc9-161b6b16f93d-kube-api-access-lcq9n\") pod \"migrator-59844c95c7-bpk8d\" (UID: \"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.459205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.479078 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.481628 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fpxzh"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.481679 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.495166 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.534184 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.543325 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.550747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.555603 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.556336 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.056306008 +0000 UTC m=+207.594501139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.559606 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.563230 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.598401 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.652504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.652841 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.152824404 +0000 UTC m=+207.691019535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: W0319 00:10:42.706389 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd804c03_3021_44bd_8ce8_a10a482c59b4.slice/crio-c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1 WatchSource:0}: Error finding container c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1: Status 404 returned error can't find the container with id c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1 Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.754022 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.754477 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.254461171 +0000 UTC m=+207.792656302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.756603 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" podStartSLOduration=168.756582891 podStartE2EDuration="2m48.756582891s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:42.709127375 +0000 UTC m=+207.247322506" watchObservedRunningTime="2026-03-19 00:10:42.756582891 +0000 UTC m=+207.294778022" Mar 19 00:10:42 crc kubenswrapper[4745]: W0319 00:10:42.778215 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199f2552_58de_4ea8_adf5_f1aee925f49b.slice/crio-a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642 WatchSource:0}: Error finding container a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642: Status 404 returned error can't find the container with id a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642 Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.847808 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.855031 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.355010621 +0000 UTC m=+207.893205752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.855066 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.855382 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.856095 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.356068417 +0000 UTC m=+207.894263548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.865503 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" podStartSLOduration=168.865485249 podStartE2EDuration="2m48.865485249s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:42.86399985 +0000 UTC m=+207.402194991" watchObservedRunningTime="2026-03-19 00:10:42.865485249 +0000 UTC m=+207.403680380" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.958376 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.959158 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.45913181 +0000 UTC m=+207.997326941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.061105 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.061491 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.56147881 +0000 UTC m=+208.099673931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.163109 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.163516 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.663494899 +0000 UTC m=+208.201690040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.230051 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" podStartSLOduration=169.230028809 podStartE2EDuration="2m49.230028809s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.223005196 +0000 UTC m=+207.761200347" watchObservedRunningTime="2026-03-19 00:10:43.230028809 +0000 UTC m=+207.768223960" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.264664 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.265041 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.765028212 +0000 UTC m=+208.303223343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.288529 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" podStartSLOduration=169.288509922 podStartE2EDuration="2m49.288509922s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.288236873 +0000 UTC m=+207.826432004" watchObservedRunningTime="2026-03-19 00:10:43.288509922 +0000 UTC m=+207.826705053" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.366411 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.367160 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.867144694 +0000 UTC m=+208.405339825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.435325 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" podStartSLOduration=169.435303709 podStartE2EDuration="2m49.435303709s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.370807356 +0000 UTC m=+207.909002487" watchObservedRunningTime="2026-03-19 00:10:43.435303709 +0000 UTC m=+207.973498840" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.483114 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.483415 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.983404037 +0000 UTC m=+208.521599168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.502185 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8m25n" event={"ID":"e5beee30-fb62-4d40-91bd-c2f4b1efab1f","Type":"ContainerStarted","Data":"0fc595965cbda51556e2bcb7121dc07342283b1b7d2cc42a9b005cfc92033228"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.504811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jrq7v" event={"ID":"5c3bf4d2-ad08-42d0-bd92-f94074fc4833","Type":"ContainerStarted","Data":"b836bf30cafe76b55123bd7ef9a2c3b608c7ecbd64d73eee0d014dac33a102ec"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.518532 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ssbjs" event={"ID":"053b13b0-078a-45ea-a005-e38aab17b42f","Type":"ContainerStarted","Data":"407356ac1a1db0d3eb06d4b2891c2b80f16c2f7708e3ec79c43354714f8e98f1"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.544054 4745 generic.go:334] "Generic (PLEG): container finished" podID="19580e75-5123-4261-ac5c-96dbd7834613" containerID="13fe52116b9b431e2966593c12d825d4267101d61b43a3ee6a2395733445a5b4" exitCode=0 Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.545024 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" 
event={"ID":"19580e75-5123-4261-ac5c-96dbd7834613","Type":"ContainerDied","Data":"13fe52116b9b431e2966593c12d825d4267101d61b43a3ee6a2395733445a5b4"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.553371 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" event={"ID":"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330","Type":"ContainerStarted","Data":"2a61cdef8a7ab2eb222650aab751fecc0381114e26cb753d2ce9e33592c1ea4e"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.591045 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerStarted","Data":"1ba6e1f38cebd60b48f169bf9b16cb68b35cbd4232e7b7482a4e5339486334e0"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.602965 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" event={"ID":"199f2552-58de-4ea8-adf5-f1aee925f49b","Type":"ContainerStarted","Data":"a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.605563 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ljtrr" event={"ID":"38dd3b53-64de-4201-b427-0b1bc3e51849","Type":"ContainerStarted","Data":"06a819ea983bf97ea0acfffb68d054ca1811dc7ddd4281a8c5ae4c1f7282ba1b"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.612061 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.615682 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.11564924 +0000 UTC m=+208.653844381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.626264 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.628180 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.128166736 +0000 UTC m=+208.666361867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.647917 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerStarted","Data":"6ed318766e28a7953b973ede1884a1baaf2b2983a95f1b27491abc323fc1f4ae"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.698100 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" event={"ID":"a43287ca-c4a1-424a-86b5-f7f4f1a627a9","Type":"ContainerStarted","Data":"4e1ecca63a7c08df02a52f04baf81555a44dbe0946e6423e828e3df2518bba61"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.712573 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerStarted","Data":"f0c4e6e413094127a3df81fb923d127a9cc66b65e1e4d1cf289b3133f8a3d81a"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.729838 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.746025 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.24598521 +0000 UTC m=+208.784180341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.754998 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" event={"ID":"fd804c03-3021-44bd-8ce8-a10a482c59b4","Type":"ContainerStarted","Data":"c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.774058 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" event={"ID":"116f15d2-ff67-4a98-846a-29bd6a129bbd","Type":"ContainerStarted","Data":"0b1f554776336ed65fdb0dd1df2acef087f9af61751e65e93f4ff9e5891a6202"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.790585 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.849740 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.852084 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.352069664 +0000 UTC m=+208.890264795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.926747 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" podStartSLOduration=169.926731334 podStartE2EDuration="2m49.926731334s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.926334261 +0000 UTC m=+208.464529402" watchObservedRunningTime="2026-03-19 00:10:43.926731334 +0000 UTC m=+208.464926465" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.951382 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.952216 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.45219836 +0000 UTC m=+208.990393481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.954478 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29564640-xrq9h" podStartSLOduration=169.954456866 podStartE2EDuration="2m49.954456866s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.952023744 +0000 UTC m=+208.490218875" watchObservedRunningTime="2026-03-19 00:10:43.954456866 +0000 UTC m=+208.492651997" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.019966 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:44 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.020697 4745 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.054055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.054515 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.554502239 +0000 UTC m=+209.092697370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.157521 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.158006 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.657988797 +0000 UTC m=+209.196183928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.266041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.266632 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.766620035 +0000 UTC m=+209.304815166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.332633 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t99wg"] Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.353004 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"] Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.353289 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jrq7v" podStartSLOduration=170.353259543 podStartE2EDuration="2m50.353259543s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.340396046 +0000 UTC m=+208.878591177" watchObservedRunningTime="2026-03-19 00:10:44.353259543 +0000 UTC m=+208.891454674" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.369416 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.369757 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.86972943 +0000 UTC m=+209.407924561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.389156 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ssbjs" podStartSLOduration=170.389128375 podStartE2EDuration="2m50.389128375s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.385379461 +0000 UTC m=+208.923574592" watchObservedRunningTime="2026-03-19 00:10:44.389128375 +0000 UTC m=+208.927323506" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.401584 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"] Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.472657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc 
kubenswrapper[4745]: E0319 00:10:44.472992 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.97297964 +0000 UTC m=+209.511174761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.574190 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.574872 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.074856075 +0000 UTC m=+209.613051206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.676702 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.677021 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.177009858 +0000 UTC m=+209.715204989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.782349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.783249 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.283233167 +0000 UTC m=+209.821428288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.804080 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:44 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.804128 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.846996 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerStarted","Data":"78985eeb1b6cbf5102aef43d77d961e7298828c776df768c5bf0697baf2864d4"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.857627 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" event={"ID":"19580e75-5123-4261-ac5c-96dbd7834613","Type":"ContainerStarted","Data":"87339a16f298d1a4a72e4572fa9f1ebb981ab0ccb8ac47e278cc8ed00c9086ef"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 
00:10:44.858921 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.860142 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" event={"ID":"5e57c8e3-d9a0-42bc-98d3-336656039e9c","Type":"ContainerStarted","Data":"904d5df0701de0ed53d5eb62af3fa4561c590e320e15e70fe544c6dd7e0f874d"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.861126 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" event={"ID":"fd804c03-3021-44bd-8ce8-a10a482c59b4","Type":"ContainerStarted","Data":"b64204c9abae73978603b4fbd4fb335833fc1e0d343bc43acc9a2faa6daab5ad"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.863765 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.866731 4745 patch_prober.go:28] interesting pod/console-operator-58897d9998-fpxzh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.866869 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" podUID="fd804c03-3021-44bd-8ce8-a10a482c59b4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.885322 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.889003 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" podStartSLOduration=170.888985431 podStartE2EDuration="2m50.888985431s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.881649196 +0000 UTC m=+209.419844327" watchObservedRunningTime="2026-03-19 00:10:44.888985431 +0000 UTC m=+209.427180562" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.889537 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.389525239 +0000 UTC m=+209.927720370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.890403 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" event={"ID":"ab89302b-10a8-43fa-ad93-699274acaac3","Type":"ContainerStarted","Data":"4c72380c747f9ba0747e7bd063d99600527ad18ace1c01c5066ad909b432c333"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.892919 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" event={"ID":"199f2552-58de-4ea8-adf5-f1aee925f49b","Type":"ContainerStarted","Data":"b73ccaa8f9fc933966f7bc38763ad254954a0421577b83726cbe7c1e025ed15a"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.896499 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ljtrr" event={"ID":"38dd3b53-64de-4201-b427-0b1bc3e51849","Type":"ContainerStarted","Data":"4892f51e711725e1250e3b1141628fa13136d81a6a085ea386d07481683c75e7"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.897944 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.900902 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" 
start-of-body= Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.900935 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.921822 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" podStartSLOduration=170.921806701 podStartE2EDuration="2m50.921806701s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.921515301 +0000 UTC m=+209.459710432" watchObservedRunningTime="2026-03-19 00:10:44.921806701 +0000 UTC m=+209.460001832" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.922330 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" event={"ID":"116f15d2-ff67-4a98-846a-29bd6a129bbd","Type":"ContainerStarted","Data":"cc68f7725174dd71269b9d76da0bec440434f16b38af3efc1226f021a7ab8035"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.950238 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" event={"ID":"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c","Type":"ContainerStarted","Data":"6ef5b654b82c6cb0ee18ace7ea52f6949cfabf5e28bdeb589877987821497a27"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.966963 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" podStartSLOduration=170.96694508 podStartE2EDuration="2m50.96694508s" podCreationTimestamp="2026-03-19 00:07:54 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.965590685 +0000 UTC m=+209.503785826" watchObservedRunningTime="2026-03-19 00:10:44.96694508 +0000 UTC m=+209.505140211" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.985855 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.987368 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.487352769 +0000 UTC m=+210.025547900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.004422 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" event={"ID":"a43287ca-c4a1-424a-86b5-f7f4f1a627a9","Type":"ContainerStarted","Data":"0a5dc6241dc9c5f25fea3eba32cc7248c7e5aab6abf8a2997db181e1200f9d0b"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.014305 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerStarted","Data":"5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.014709 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.015979 4745 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-522nc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" start-of-body= Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.017221 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.038763 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" event={"ID":"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330","Type":"ContainerStarted","Data":"cb7d6e63f2def1fa1c9273f764b0371b6595ee5b9e0575c36e1c68a6e225e24a"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.038830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" event={"ID":"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330","Type":"ContainerStarted","Data":"46d4bd55d82782d41dccb95fb2c4f3eec6a6684e7d20a824ed68bfabd6ca757a"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.042737 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" podStartSLOduration=171.042717087 podStartE2EDuration="2m51.042717087s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.992204099 +0000 UTC m=+209.530399230" watchObservedRunningTime="2026-03-19 00:10:45.042717087 +0000 UTC m=+209.580912218" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.070580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerStarted","Data":"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.074106 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:45 crc kubenswrapper[4745]: 
I0319 00:10:45.082996 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" podStartSLOduration=171.082960724 podStartE2EDuration="2m51.082960724s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.042654205 +0000 UTC m=+209.580849336" watchObservedRunningTime="2026-03-19 00:10:45.082960724 +0000 UTC m=+209.621155855" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.084197 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" podStartSLOduration=171.084190985 podStartE2EDuration="2m51.084190985s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.072535288 +0000 UTC m=+209.610730419" watchObservedRunningTime="2026-03-19 00:10:45.084190985 +0000 UTC m=+209.622386116" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.091031 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qn8c4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.091091 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.092507 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.096641 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.596624569 +0000 UTC m=+210.134819700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.111737 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8m25n" event={"ID":"e5beee30-fb62-4d40-91bd-c2f4b1efab1f","Type":"ContainerStarted","Data":"35405b1a963706967dee2af7c746b2e7281fcebdd54b9c8264435060fbf73e5d"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.114022 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ljtrr" podStartSLOduration=171.114005375 podStartE2EDuration="2m51.114005375s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
00:10:45.112394812 +0000 UTC m=+209.650589953" watchObservedRunningTime="2026-03-19 00:10:45.114005375 +0000 UTC m=+209.652200506" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.137759 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" podStartSLOduration=171.137732634 podStartE2EDuration="2m51.137732634s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.135597513 +0000 UTC m=+209.673792644" watchObservedRunningTime="2026-03-19 00:10:45.137732634 +0000 UTC m=+209.675927765" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.139806 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" event={"ID":"4c46c928-9116-4548-b20e-d9a66d439012","Type":"ContainerStarted","Data":"d21fb403e1f1a1258f05dcaeab94eb72788193cd8b1cb8f7a5539cbd0770a04d"} Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.178460 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podStartSLOduration=171.178433535 podStartE2EDuration="2m51.178433535s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.17795724 +0000 UTC m=+209.716152381" watchObservedRunningTime="2026-03-19 00:10:45.178433535 +0000 UTC m=+209.716628666" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.184107 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"] Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.195688 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.695659528 +0000 UTC m=+210.233854659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.195826 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.197810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.201015 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.701003895 +0000 UTC m=+210.239199026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.210221 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.234498 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8m25n" podStartSLOduration=6.234475197 podStartE2EDuration="6.234475197s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.230797186 +0000 UTC m=+209.768992317" watchObservedRunningTime="2026-03-19 00:10:45.234475197 +0000 UTC m=+209.772670318" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.242653 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.254589 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.299654 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:45 crc 
kubenswrapper[4745]: E0319 00:10:45.302763 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.802712424 +0000 UTC m=+210.340907555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.329166 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.329971 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.357998 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.369785 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" podStartSLOduration=171.369758122 podStartE2EDuration="2m51.369758122s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.285362988 +0000 UTC m=+209.823558129" watchObservedRunningTime="2026-03-19 00:10:45.369758122 +0000 UTC m=+209.907953283" Mar 19 
00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.389426 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" podStartSLOduration=171.389396724 podStartE2EDuration="2m51.389396724s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.329754413 +0000 UTC m=+209.867949544" watchObservedRunningTime="2026-03-19 00:10:45.389396724 +0000 UTC m=+209.927591855" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.399831 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.402865 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.403250 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.903237204 +0000 UTC m=+210.441432335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.432205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.432276 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.434117 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-54kzj"] Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.440935 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c970483_eb0d_49da_b0bc_d1685f8bf7f1.slice/crio-95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868 WatchSource:0}: Error finding container 95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868: Status 404 returned error can't find the container with id 95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868 Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.443862 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j2mf5"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.451981 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.457604 4745 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.503432 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.508123 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.008091828 +0000 UTC m=+210.546286969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.523388 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6b29d2c_9784_4f10_b0d1_09e88ddf5df1.slice/crio-f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9 WatchSource:0}: Error finding container f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9: Status 404 returned error can't find the container with id f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9 Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.610725 4745 patch_prober.go:28] 
interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.611292 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.611316 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.111302176 +0000 UTC m=+210.649497307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.611032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.641576 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m56ls"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.645218 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t28kd"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.703822 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.704566 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.715486 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.715841 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.215827888 +0000 UTC m=+210.754023019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.716691 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.727948 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.729564 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.743366 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.778439 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.790996 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.806228 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:45 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:45 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:45 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.806287 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.806355 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.807934 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"] Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.817381 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.818587 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.318564711 +0000 UTC m=+210.856759842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.882472 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75bf4c3d_1ce3_48df_8598_7f72667807c1.slice/crio-6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121 WatchSource:0}: Error finding container 6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121: Status 404 returned error can't find the container with id 6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121 Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.920690 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.921225 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:46.42118019 +0000 UTC m=+210.959375321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.921403 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.921952 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.421943526 +0000 UTC m=+210.960138657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.946555 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bbc2a79_7f59_45c7_93ac_47ec1e2e7d1d.slice/crio-0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9 WatchSource:0}: Error finding container 0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9: Status 404 returned error can't find the container with id 0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9 Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.993005 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b00ac21_bd51_4aff_a8fc_d14d2b930940.slice/crio-2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8 WatchSource:0}: Error finding container 2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8: Status 404 returned error can't find the container with id 2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8 Mar 19 00:10:46 crc kubenswrapper[4745]: W0319 00:10:45.997051 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8f7226_6033_4b1b_bd4d_7c045b9d60ef.slice/crio-8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813 WatchSource:0}: Error finding container 8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813: Status 404 returned error can't find the container 
with id 8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813 Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.005844 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36666: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.022743 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.023050 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.523035304 +0000 UTC m=+211.061230435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.106609 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36678: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.124526 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.124805 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.624795014 +0000 UTC m=+211.162990145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.207280 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" event={"ID":"ed8d9c15-ca48-46f9-9368-36693ccdbe8b","Type":"ContainerStarted","Data":"a3c320123655c97acc20454e28e5427399334b877aa5a9e39edc365481dccd4d"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.209129 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36690: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.227377 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.228217 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.728200229 +0000 UTC m=+211.266395350 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.241948 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" event={"ID":"cb0a157b-0f6d-4738-ae67-e29407c2ba8e","Type":"ContainerStarted","Data":"de77cbaf251bea6a340e2d1d37523c239f308ef017f3be27325f8a89133255c2"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.242007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" event={"ID":"cb0a157b-0f6d-4738-ae67-e29407c2ba8e","Type":"ContainerStarted","Data":"97b3cb068ae934f465f1e3736304152c1974f3e63e6f701fa363428da62c55a9"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.253637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"9f05658108a21c1cefe2e8fcfb7c619de85dbc0a51cb6af555e69d419aec2db2"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.280259 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m56ls" event={"ID":"10e7df4c-01da-48e6-8d5d-20e788cd4cdf","Type":"ContainerStarted","Data":"9a31e96382b131a9852f2dba890059c373f00b9f7061c9235262a2b22a876a3e"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.297821 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36692: no serving certificate available for the kubelet" Mar 19 00:10:46 crc 
kubenswrapper[4745]: I0319 00:10:46.316043 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" event={"ID":"4394045d-753a-4e2b-8ea5-7087d41481d2","Type":"ContainerStarted","Data":"850635de3db4c88eabd653eb9047ccba37b18499b525639c36aa0c9c9906a67c"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.316116 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" event={"ID":"4394045d-753a-4e2b-8ea5-7087d41481d2","Type":"ContainerStarted","Data":"26cc6c9e72d0535b1f1605a83dd9d618544f1e163201dfa1d9a2581bf0d8dc66"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.317504 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.320318 4745 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pknnh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.320369 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" podUID="4394045d-753a-4e2b-8ea5-7087d41481d2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.331254 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: 
\"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.333633 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.833615721 +0000 UTC m=+211.371810852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.336309 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" event={"ID":"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1","Type":"ContainerStarted","Data":"f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.341994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerStarted","Data":"158cd16e86d97bd741b6d9d3a091e473262326dad949142d4f04bb6f64676d4b"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.369895 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" event={"ID":"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474","Type":"ContainerStarted","Data":"9a5c5a294d1c9b09d1221febdb25357b0f9b2f6929ab871225c58a83f52832a8"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.380560 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" event={"ID":"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef","Type":"ContainerStarted","Data":"8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.385799 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" event={"ID":"11217236-1702-4cd3-b097-e3e410cbbdb4","Type":"ContainerStarted","Data":"f790a779b85926d09bc924a556ff58864248e667e250e8a3faf07d58a1b83da8"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.392031 4745 patch_prober.go:28] interesting pod/apiserver-76f77b778f-zxjjt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]log ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]etcd ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/max-in-flight-filter ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 00:10:46 crc kubenswrapper[4745]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 19 00:10:46 crc kubenswrapper[4745]: 
[+]poststarthook/openshift.io-startinformers ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 00:10:46 crc kubenswrapper[4745]: livez check failed Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.392080 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" podUID="46576b1f-4646-44ba-a896-d509b05801cd" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.398181 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" event={"ID":"07e3a371-481b-4e06-a2e9-e12f3ec3d28b","Type":"ContainerStarted","Data":"c365264ac99ee5d7960d6a1d5f9d985536235be84f6ac3405fad83fa0e5b97e3"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.415330 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36698: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.425268 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" event={"ID":"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c","Type":"ContainerStarted","Data":"fd66c653a2642d275c3084126cee83ac707cbea8285ab428b1180f5004ed82ea"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.432285 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.433698 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.933677225 +0000 UTC m=+211.471872366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.459521 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" event={"ID":"4c46c928-9116-4548-b20e-d9a66d439012","Type":"ContainerStarted","Data":"f827e017aa7525133753ed566b271266d5dbc9f1fe3ddb56d2b7dae84942e48a"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.459564 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" event={"ID":"4c46c928-9116-4548-b20e-d9a66d439012","Type":"ContainerStarted","Data":"f9cd70705364425302817d991fbbb30dbd55eeda27d0d65fcd678e48dfb687db"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.467926 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" event={"ID":"5e57c8e3-d9a0-42bc-98d3-336656039e9c","Type":"ContainerStarted","Data":"53f88b1d56b28535b4c484015019af2e37753d209c01d48cfc93eb31dc1aceb7"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.476335 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" 
event={"ID":"5c970483-eb0d-49da-b0bc-d1685f8bf7f1","Type":"ContainerStarted","Data":"7f3f236625a286867dfa2fcaf4fee062fb38f1a2b848357310a406d1221d43a2"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.476375 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" event={"ID":"5c970483-eb0d-49da-b0bc-d1685f8bf7f1","Type":"ContainerStarted","Data":"95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.480117 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" event={"ID":"0a51c238-7c2c-470f-8123-472327367ec8","Type":"ContainerStarted","Data":"5d0de5e75b92038e268ee9997e6aae27e7a47e5b907366dae742f3b2a496b7ab"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.480158 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" event={"ID":"0a51c238-7c2c-470f-8123-472327367ec8","Type":"ContainerStarted","Data":"a47f19439c050d08f57fd3da8aaa73609242b18663e951037e2ae69cdfa6cc99"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.488470 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" event={"ID":"9b00ac21-bd51-4aff-a8fc-d14d2b930940","Type":"ContainerStarted","Data":"2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.491785 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" event={"ID":"4e2ca532-9e45-44d5-b541-3e9b34352d75","Type":"ContainerStarted","Data":"730a87a80725df33bc6c2202e86ed26d7d609c34a96affbf80d914bfe39df27e"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.492241 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" event={"ID":"4e2ca532-9e45-44d5-b541-3e9b34352d75","Type":"ContainerStarted","Data":"33fe87a1faaaf4734e6c67e461ca817a4f35cd5097926b95a5f6393877824430"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.492840 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.496503 4745 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wz97d container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.496564 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" podUID="4e2ca532-9e45-44d5-b541-3e9b34352d75" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.508359 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" event={"ID":"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d","Type":"ContainerStarted","Data":"70a4658f51301554466492b39937e1206d8f92491efa2ebb5066085cf5b04349"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.534055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.536626 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.036611885 +0000 UTC m=+211.574807016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.548172 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36706: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.580199 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerStarted","Data":"6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.608785 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" event={"ID":"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d","Type":"ContainerStarted","Data":"0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.627172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t28kd" 
event={"ID":"ead6910f-478b-45b5-b83c-06d3733635cb","Type":"ContainerStarted","Data":"fd8c5063bf12a3d1e820f817dcd068e4ac562fab3d742fbe681eeab9bb5a2312"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.630544 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.630586 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.631275 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qn8c4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.631300 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.636302 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.636607 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.136590556 +0000 UTC m=+211.674785687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.654495 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.671422 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36722: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.696230 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.738918 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.742118 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.242106151 +0000 UTC m=+211.780301282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.790127 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36734: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.792959 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:46 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.799287 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.842662 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.843094 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.343064435 +0000 UTC m=+211.881259566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.943607 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.944389 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.444378051 +0000 UTC m=+211.982573182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.046086 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.047291 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.547261369 +0000 UTC m=+212.085456500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.047627 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.047949 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.547942062 +0000 UTC m=+212.086137193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.078503 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m56ls" podStartSLOduration=8.078485186 podStartE2EDuration="8.078485186s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.078209437 +0000 UTC m=+211.616404578" watchObservedRunningTime="2026-03-19 00:10:47.078485186 +0000 UTC m=+211.616680317" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.111554 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" podStartSLOduration=173.111533894 podStartE2EDuration="2m53.111533894s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.036663297 +0000 UTC m=+211.574858428" watchObservedRunningTime="2026-03-19 00:10:47.111533894 +0000 UTC m=+211.649729025" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.148198 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.148455 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.64844055 +0000 UTC m=+212.186635671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.186139 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" podStartSLOduration=173.186120481 podStartE2EDuration="2m53.186120481s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.132334404 +0000 UTC m=+211.670529535" watchObservedRunningTime="2026-03-19 00:10:47.186120481 +0000 UTC m=+211.724315612" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.229337 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" podStartSLOduration=173.229321097 podStartE2EDuration="2m53.229321097s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.227875098 +0000 UTC 
m=+211.766070259" watchObservedRunningTime="2026-03-19 00:10:47.229321097 +0000 UTC m=+211.767516238" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.229609 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" podStartSLOduration=173.229601526 podStartE2EDuration="2m53.229601526s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.188392547 +0000 UTC m=+211.726587678" watchObservedRunningTime="2026-03-19 00:10:47.229601526 +0000 UTC m=+211.767796667" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.255823 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.256267 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.756250691 +0000 UTC m=+212.294445822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.352805 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" podStartSLOduration=173.352786789 podStartE2EDuration="2m53.352786789s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.304946299 +0000 UTC m=+211.843141450" watchObservedRunningTime="2026-03-19 00:10:47.352786789 +0000 UTC m=+211.890981920" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.357472 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.357797 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.857782494 +0000 UTC m=+212.395977615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.398266 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" podStartSLOduration=173.398250278 podStartE2EDuration="2m53.398250278s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.354476604 +0000 UTC m=+211.892671735" watchObservedRunningTime="2026-03-19 00:10:47.398250278 +0000 UTC m=+211.936445409" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.429571 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.438482 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" podStartSLOduration=173.438464204 podStartE2EDuration="2m53.438464204s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.398403793 +0000 UTC m=+211.936598934" watchObservedRunningTime="2026-03-19 00:10:47.438464204 +0000 UTC m=+211.976659335" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.459338 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.459782 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.959771222 +0000 UTC m=+212.497966353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.490271 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" podStartSLOduration=173.490251395 podStartE2EDuration="2m53.490251395s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.489410107 +0000 UTC m=+212.027605238" watchObservedRunningTime="2026-03-19 00:10:47.490251395 +0000 UTC m=+212.028446526" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.565248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.565750 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.065735243 +0000 UTC m=+212.603930364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.591344 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36744: no serving certificate available for the kubelet" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.628953 4745 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-522nc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.629018 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.648032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" event={"ID":"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef","Type":"ContainerStarted","Data":"233cd4d2d7675e58c02b7c7fcd3b8f3f72ecebe92cdb5bd339ef3320b6064b2a"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.648088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" event={"ID":"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef","Type":"ContainerStarted","Data":"7792c7b67854f5d80159c20538facbcb3e17da47a14c28a612bf193658f9e505"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.648831 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.658699 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" event={"ID":"07e3a371-481b-4e06-a2e9-e12f3ec3d28b","Type":"ContainerStarted","Data":"120c624cd3e05399f1516d9a371ad97a27d4c570451ddeca5e53a8001229af9f"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.658756 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" event={"ID":"07e3a371-481b-4e06-a2e9-e12f3ec3d28b","Type":"ContainerStarted","Data":"d1a48b356cbda6d6d63f06aa8c59df3b1dec71943a2dafd2adfa31ddf8fe0ae0"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.666333 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" 
event={"ID":"ed8d9c15-ca48-46f9-9368-36693ccdbe8b","Type":"ContainerStarted","Data":"9ffe20c3a8c968c06e01ed2950b65ea7adc4b0b6959b0f17fd566d35bd22f752"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.667154 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.668102 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.168089243 +0000 UTC m=+212.706284374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.684577 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" podStartSLOduration=173.68455915 podStartE2EDuration="2m53.68455915s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.681899581 +0000 UTC m=+212.220094712" watchObservedRunningTime="2026-03-19 
00:10:47.68455915 +0000 UTC m=+212.222754281" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.699175 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.699482 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" containerID="cri-o://47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" gracePeriod=30 Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.728464 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t28kd" event={"ID":"ead6910f-478b-45b5-b83c-06d3733635cb","Type":"ContainerStarted","Data":"4fb945a1dd06b0eedfc8c31073b82dc3c078b39b4c594868fd0390f0d5038d03"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.732234 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" podStartSLOduration=173.732217533 podStartE2EDuration="2m53.732217533s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.729072438 +0000 UTC m=+212.267267579" watchObservedRunningTime="2026-03-19 00:10:47.732217533 +0000 UTC m=+212.270412664" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.744120 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" event={"ID":"0a51c238-7c2c-470f-8123-472327367ec8","Type":"ContainerStarted","Data":"9184b12a3f20fa02533e2f845c578c719786e03937f5ec24372bae1d7dbc1c80"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.765354 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.765544 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerName="route-controller-manager" containerID="cri-o://adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" gracePeriod=30 Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.769264 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.769402 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.269384698 +0000 UTC m=+212.807579829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.769571 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.771400 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.271386004 +0000 UTC m=+212.809581135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.772075 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"a51cd4499cb77c80bc2273995247be55c1ea12c65b716cf8ee53249eb4b31c8d"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.774905 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" event={"ID":"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d","Type":"ContainerStarted","Data":"25ee3192903054722ca0b27799a7cdca6ae03628c87139ac7beb46814f2ce3a1"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.774938 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" event={"ID":"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d","Type":"ContainerStarted","Data":"47cac70244a977c64b07076488a41e24da4931abf01f36a9cb57a257c6591a6d"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.778012 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m56ls" event={"ID":"10e7df4c-01da-48e6-8d5d-20e788cd4cdf","Type":"ContainerStarted","Data":"51b31deae6284da1e0c6544e2eb4fdf3f9ebb6934497896f028fbf5e82ad7df1"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.809849 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" 
event={"ID":"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d","Type":"ContainerStarted","Data":"1689e54ac6b38bf6f8e51354e2859d39b0a2257632494bd652cf7dcd045d2767"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.810632 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:47 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:47 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:47 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.810674 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.812661 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.819131 4745 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c2x4m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.819187 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" podUID="7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 
00:10:47.843743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" event={"ID":"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474","Type":"ContainerStarted","Data":"28132d65e459ce4465574a3af0e9a6857fa66851b653a8259754ea3ada5b1a06"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.864023 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" event={"ID":"11217236-1702-4cd3-b097-e3e410cbbdb4","Type":"ContainerStarted","Data":"37dae2266302bca7574c610df93b0c5d759818fce7d3068c5e513e570657ed24"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.871154 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.872861 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.372843534 +0000 UTC m=+212.911038665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.875358 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" podStartSLOduration=173.875339667 podStartE2EDuration="2m53.875339667s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.874085375 +0000 UTC m=+212.412280506" watchObservedRunningTime="2026-03-19 00:10:47.875339667 +0000 UTC m=+212.413534798" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.876244 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" podStartSLOduration=173.876235767 podStartE2EDuration="2m53.876235767s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.801492714 +0000 UTC m=+212.339687845" watchObservedRunningTime="2026-03-19 00:10:47.876235767 +0000 UTC m=+212.414430898" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.883317 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" 
event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerStarted","Data":"9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.884831 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" event={"ID":"9b00ac21-bd51-4aff-a8fc-d14d2b930940","Type":"ContainerStarted","Data":"72e47f42a61f6f960a5acc35595487c91d5d0582bf390852b826e9713883ccb3"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.914821 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" event={"ID":"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1","Type":"ContainerStarted","Data":"38472aefa56c93474d44ed4ee109273167b40317645d09b3fdee2d2de98038fd"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.914912 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" event={"ID":"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1","Type":"ContainerStarted","Data":"6d43969fa88e73d0ecf4ee37011ecf68c352cb1f985f7c03dfdcc19d11eae93a"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.924422 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.924509 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.925559 4745 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" podStartSLOduration=173.925545305 podStartE2EDuration="2m53.925545305s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.924728488 +0000 UTC m=+212.462923629" watchObservedRunningTime="2026-03-19 00:10:47.925545305 +0000 UTC m=+212.463740426" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.927710 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.945800 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.962631 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" podStartSLOduration=173.962610777 podStartE2EDuration="2m53.962610777s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.960953181 +0000 UTC m=+212.499148332" watchObservedRunningTime="2026-03-19 00:10:47.962610777 +0000 UTC m=+212.500805908" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.973973 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc 
kubenswrapper[4745]: E0319 00:10:47.976356 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.476341923 +0000 UTC m=+213.014537054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.002557 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" podStartSLOduration=174.002535273 podStartE2EDuration="2m54.002535273s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.996193322 +0000 UTC m=+212.534388483" watchObservedRunningTime="2026-03-19 00:10:48.002535273 +0000 UTC m=+212.540730404" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.076371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.077038 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.577020617 +0000 UTC m=+213.115215748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.125413 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" podStartSLOduration=174.125393665 podStartE2EDuration="2m54.125393665s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:48.075696483 +0000 UTC m=+212.613891614" watchObservedRunningTime="2026-03-19 00:10:48.125393665 +0000 UTC m=+212.663588806" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.178484 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.178831 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:48.678818139 +0000 UTC m=+213.217013270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.279778 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.280528 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.780506737 +0000 UTC m=+213.318701868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.280599 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.281293 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.781281673 +0000 UTC m=+213.319476804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.382535 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.382673 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.882649891 +0000 UTC m=+213.420845022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.383041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.383356 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.883343594 +0000 UTC m=+213.421538725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.395880 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.455030 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" podStartSLOduration=174.455015494 podStartE2EDuration="2m54.455015494s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:48.151005096 +0000 UTC m=+212.689200227" watchObservedRunningTime="2026-03-19 00:10:48.455015494 +0000 UTC m=+212.993210615" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.490426 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.490753 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:48.990739011 +0000 UTC m=+213.528934142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.498397 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.595055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.595399 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.095387658 +0000 UTC m=+213.633582789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.673135 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.695958 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696106 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696128 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696174 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod 
\"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696266 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.696567 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.196548428 +0000 UTC m=+213.734743559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.697007 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca" (OuterVolumeSpecName: "client-ca") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.697171 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config" (OuterVolumeSpecName: "config") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.705389 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr" (OuterVolumeSpecName: "kube-api-access-866gr") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "kube-api-access-866gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.709509 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.790513 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:48 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:48 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:48 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.790758 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798439 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798667 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 
00:10:48.798727 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798768 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799061 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799075 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799085 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799093 4745 reconciler_common.go:293] "Volume detached for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799269 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.799328 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.299316782 +0000 UTC m=+213.837511913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799521 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config" (OuterVolumeSpecName: "config") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.804545 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9" (OuterVolumeSpecName: "kube-api-access-zznl9") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "kube-api-access-zznl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.809333 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.900755 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.901181 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.401148615 +0000 UTC m=+213.939343756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901546 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901565 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901576 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901587 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901598 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.901675 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.401659962 +0000 UTC m=+213.939855093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.929517 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t28kd" event={"ID":"ead6910f-478b-45b5-b83c-06d3733635cb","Type":"ContainerStarted","Data":"d1fba5ed2707a755db37c09bcae3bafe499cb631e80e4e99585d82257792a8de"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.930474 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934280 4745 generic.go:334] "Generic (PLEG): container finished" podID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" exitCode=0 Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934350 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerDied","Data":"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934380 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerDied","Data":"4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934400 4745 scope.go:117] "RemoveContainer" 
containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934519 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.971994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" event={"ID":"11217236-1702-4cd3-b097-e3e410cbbdb4","Type":"ContainerStarted","Data":"5c0c2130c98ce029a089c367beee380b3f3abf95cf3be36859beaf586b5b748c"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.962556 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36754: no serving certificate available for the kubelet" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.980232 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t28kd" podStartSLOduration=9.980207871 podStartE2EDuration="9.980207871s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:48.964607413 +0000 UTC m=+213.502802564" watchObservedRunningTime="2026-03-19 00:10:48.980207871 +0000 UTC m=+213.518402992" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.992260 4745 generic.go:334] "Generic (PLEG): container finished" podID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" exitCode=0 Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.993595 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.994023 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerDied","Data":"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.994111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerDied","Data":"4cb2967d4cad71f05028927d26e36d1cb64a836920dd56873c07f0bfdd7a7214"} Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:48.997626 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.001416 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerName="route-controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.001505 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerName="route-controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.001566 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.001640 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.001855 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" 
containerName="route-controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.005586 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.008193 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.008499 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.008618 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.009126 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.509108801 +0000 UTC m=+214.047303932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.009851 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.010095 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.010641 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.510628321 +0000 UTC m=+214.048823452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.011914 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.042506 4745 scope.go:117] "RemoveContainer" containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.045516 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863\": container with ID starting with 47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863 not found: ID does not exist" containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.045559 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863"} err="failed to get container status \"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863\": rpc error: code = NotFound desc = could not find container \"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863\": container with ID starting with 47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863 not found: ID does not exist" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.045583 4745 scope.go:117] "RemoveContainer" 
containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.057134 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.066282 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.120732 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.121136 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.121322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.121347 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod 
\"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.122507 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.622493087 +0000 UTC m=+214.160688218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.142038 4745 scope.go:117] "RemoveContainer" containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.150066 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1\": container with ID starting with adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1 not found: ID does not exist" containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.150118 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1"} err="failed to get container status \"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1\": rpc error: code = NotFound desc = 
could not find container \"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1\": container with ID starting with adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1 not found: ID does not exist" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.151590 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" podStartSLOduration=175.151575934 podStartE2EDuration="2m55.151575934s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:49.082262462 +0000 UTC m=+213.620457593" watchObservedRunningTime="2026-03-19 00:10:49.151575934 +0000 UTC m=+213.689771065" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.188233 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.205519 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.208641 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.217184 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223076 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223137 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.224528 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod 
\"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.225080 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.725069986 +0000 UTC m=+214.263265107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.225393 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.237722 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.238866 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.259004 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"community-operators-q9zn6\" (UID: 
\"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.324619 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.325092 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.825060147 +0000 UTC m=+214.363255288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325223 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325283 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wwh\" (UniqueName: 
\"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325391 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325449 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.326014 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.826004219 +0000 UTC m=+214.364199350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.366661 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.384968 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.386336 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.413285 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.426630 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.426974 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc 
kubenswrapper[4745]: I0319 00:10:49.427003 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.427056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.427708 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.427725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.427825 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.92780463 +0000 UTC m=+214.465999821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.460449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530203 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530261 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530289 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod 
\"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.536235 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.036215152 +0000 UTC m=+214.574410353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.559141 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.578373 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.579513 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.615646 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633467 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633858 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633885 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.634604 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.634705 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.134686284 +0000 UTC m=+214.672881415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.634713 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.664182 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736064 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736120 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736154 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736730 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.737632 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.237619093 +0000 UTC m=+214.775814224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.789936 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:49 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:49 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:49 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.790287 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.835411 4745 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 00:10:49 crc 
kubenswrapper[4745]: I0319 00:10:49.838095 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.838554 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.838609 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.338565286 +0000 UTC m=+214.876760417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.838903 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.838994 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.839056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.839479 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:50.339461326 +0000 UTC m=+214.877656457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.840073 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.841614 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.860037 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.871339 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.900718 4745 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.926825 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: W0319 00:10:49.934529 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21b8175_025a_4d91_ad43_389dbad40846.slice/crio-44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9 WatchSource:0}: Error finding container 44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9: Status 404 returned error can't find the container with id 44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9 Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.939685 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.940014 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.439999935 +0000 UTC m=+214.978195066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.033714 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerStarted","Data":"44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.035478 4745 generic.go:334] "Generic (PLEG): container finished" podID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerID="9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321" exitCode=0 Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.035742 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerDied","Data":"9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.055241 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.055818 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.555804663 +0000 UTC m=+215.093999794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.087352 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"872a2af3b5e410cebc7b95e37c95c913e524a6dae4e0a087b8b1a382f2544e63"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.096575 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerStarted","Data":"1385db0e9218cd6a53bd844f3c99f4797ac5eed30c3c8451117c54ef69a818d9"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.147956 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" path="/var/lib/kubelet/pods/16dafcd2-537e-46fe-8028-41bc6ff146a0/volumes" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.152770 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" path="/var/lib/kubelet/pods/555c1cf8-c2b3-4e47-9fa9-314a8672b437/volumes" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.156276 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.156634 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.656606262 +0000 UTC m=+215.194801393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.156711 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.157151 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.657137489 +0000 UTC m=+215.195332620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.219800 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.257926 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.258475 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.758453505 +0000 UTC m=+215.296648646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.258501 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.259486 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.261121 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.264278 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.764257788 +0000 UTC m=+215.302453009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266211 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266722 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266826 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266985 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.267106 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.276138 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.277307 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282357 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282651 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282688 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282962 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.283021 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.284993 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.285226 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.286432 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.300314 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.300373 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 
00:10:50.300387 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.328353 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 00:10:50 crc kubenswrapper[4745]: W0319 00:10:50.334604 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d29a41_94df_42b0_b7d3_6b47b06a238f.slice/crio-9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb WatchSource:0}: Error finding container 9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb: Status 404 returned error can't find the container with id 9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.341256 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.342200 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.346480 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.350213 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.350351 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.353355 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.362174 4745 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T00:10:49.835440833Z","Handler":null,"Name":""} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.364132 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.364411 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.364480 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.364691 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.864651033 +0000 UTC m=+215.402846154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365351 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 
crc kubenswrapper[4745]: I0319 00:10:50.365446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365494 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365519 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365545 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365563 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365582 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.365917 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.865901314 +0000 UTC m=+215.404096445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.383810 4745 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.383848 4745 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469416 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469826 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469856 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469931 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469970 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469987 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470001 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 
00:10:50.470020 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470042 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470094 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470113 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470150 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " 
pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.482259 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.488088 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.514233 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.536800 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.537342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.537760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.538224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.550378 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.550490 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 
00:10:50.575046 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.575487 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.575523 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.575966 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.579061 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 
crc kubenswrapper[4745]: I0319 00:10:50.585042 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.585091 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.615304 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.615601 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.641593 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.699286 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.733612 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.852402 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:50 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:50 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:50 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.852462 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.863611 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.961233 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.091909 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.109637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerStarted","Data":"61b844b715e359e6d46bde055bd1277533b2b0f0ca8e92400f5004133e860078"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.114116 4745 generic.go:334] "Generic (PLEG): container finished" podID="c21b8175-025a-4d91-ad43-389dbad40846" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0" exitCode=0 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.114189 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.118391 4745 generic.go:334] "Generic (PLEG): container finished" podID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerID="3c0dd7e0c251e39bd78fdfc535f458fe29dccacfeda18a0fdd0fe102becb3d5f" exitCode=0 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.118458 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"3c0dd7e0c251e39bd78fdfc535f458fe29dccacfeda18a0fdd0fe102becb3d5f"} Mar 19 00:10:51 crc 
kubenswrapper[4745]: I0319 00:10:51.118482 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerStarted","Data":"9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.144350 4745 generic.go:334] "Generic (PLEG): container finished" podID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" exitCode=0 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.144971 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.161083 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.163871 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"7448deb36fe99ac300f92c9645b927ee7a1f8f2550139c1903a5fa47745379cf"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.169034 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"1b0420993225746b8972ede8e7be35907c73573aa37a33cea54d5d2273ae112d"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.182734 4745 generic.go:334] "Generic (PLEG): container finished" podID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925" exitCode=0 Mar 19 00:10:51 crc 
kubenswrapper[4745]: I0319 00:10:51.183043 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.183070 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerStarted","Data":"5cf5d6c8b2c76c4c80fde1c17a6532692631c387bb1a224bdbe1f73591bc68b3"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.195825 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.209185 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.209366 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.213464 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.217736 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" podStartSLOduration=12.21770309 podStartE2EDuration="12.21770309s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:51.213512232 +0000 UTC m=+215.751707383" watchObservedRunningTime="2026-03-19 00:10:51.21770309 +0000 UTC m=+215.755898221" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.236718 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:10:51 crc kubenswrapper[4745]: W0319 00:10:51.261900 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4be3ad0_fb20_41e9_9aaf_e55d86cdd1bb.slice/crio-2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649 WatchSource:0}: Error finding container 2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649: Status 404 returned error can't find the container with id 2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.287317 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 
00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.287446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.287524 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.368583 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.369584 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.379723 4745 patch_prober.go:28] interesting pod/console-f9d7485db-ssbjs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.379785 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ssbjs" podUID="053b13b0-078a-45ea-a005-e38aab17b42f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.390436 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.390515 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.390571 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.391141 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.391208 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.418410 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqj2f\" (UniqueName: 
\"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.474909 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.533108 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.541108 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.582712 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36764: no serving certificate available for the kubelet" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.595423 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"75bf4c3d-1ce3-48df-8598-7f72667807c1\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.595528 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"75bf4c3d-1ce3-48df-8598-7f72667807c1\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.595684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod 
\"75bf4c3d-1ce3-48df-8598-7f72667807c1\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.597558 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:10:51 crc kubenswrapper[4745]: E0319 00:10:51.597844 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerName="collect-profiles" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.598166 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerName="collect-profiles" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.598404 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerName="collect-profiles" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.601846 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.597564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "75bf4c3d-1ce3-48df-8598-7f72667807c1" (UID: "75bf4c3d-1ce3-48df-8598-7f72667807c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.614069 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75bf4c3d-1ce3-48df-8598-7f72667807c1" (UID: "75bf4c3d-1ce3-48df-8598-7f72667807c1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.614261 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s" (OuterVolumeSpecName: "kube-api-access-wwv7s") pod "75bf4c3d-1ce3-48df-8598-7f72667807c1" (UID: "75bf4c3d-1ce3-48df-8598-7f72667807c1"). InnerVolumeSpecName "kube-api-access-wwv7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.623742 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686711 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686755 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686795 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686830 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697364 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697447 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697463 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697476 4745 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.788295 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.794857 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:51 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:51 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:51 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.794998 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.808611 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.808666 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc 
kubenswrapper[4745]: I0319 00:10:51.808741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.809362 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.809908 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.834472 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.933816 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.009685 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:10:52 crc kubenswrapper[4745]: W0319 00:10:52.031843 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3c406d_9994_4629_b585_4d145b1e04aa.slice/crio-3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176 WatchSource:0}: Error finding container 3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176: Status 404 returned error can't find the container with id 3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176 Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.165257 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.214318 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.231659 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.256367 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.263838 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.319251 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.319398 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.319481 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.348538 4745 generic.go:334] "Generic (PLEG): container finished" podID="6368460d-1bb9-4315-9730-1cf1673361fe" containerID="5b74ce15b598bc3286b90032d7a9c3ca2e9d7505c2ee3febadcd909eeca62c01" exitCode=0 Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.348662 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6368460d-1bb9-4315-9730-1cf1673361fe","Type":"ContainerDied","Data":"5b74ce15b598bc3286b90032d7a9c3ca2e9d7505c2ee3febadcd909eeca62c01"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.348702 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6368460d-1bb9-4315-9730-1cf1673361fe","Type":"ContainerStarted","Data":"83e03c767a24ab1198cfb801009d43bc183be0d17f5e9bd4c865466aea4b9acd"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.368901 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerStarted","Data":"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.368945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerStarted","Data":"de743d9871d9e519d057ac18763fdb10aeceb0154e79a293ccaea85445d780d3"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.369908 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.420507 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.420553 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rttcj\" 
(UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.420629 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.421480 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.421789 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.425536 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerDied","Data":"6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.425587 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.425671 4745 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.477479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.490088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerStarted","Data":"6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.490501 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.496324 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.509514 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerStarted","Data":"3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.538553 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" podStartSLOduration=178.538531238 podStartE2EDuration="2m58.538531238s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:52.436004772 +0000 UTC m=+216.974199903" watchObservedRunningTime="2026-03-19 00:10:52.538531238 +0000 UTC m=+217.076726369" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.539036 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podStartSLOduration=4.539019764 podStartE2EDuration="4.539019764s" podCreationTimestamp="2026-03-19 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:52.537477373 +0000 UTC m=+217.075672524" watchObservedRunningTime="2026-03-19 00:10:52.539019764 +0000 UTC m=+217.077214905" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.577411 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerStarted","Data":"b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.577546 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerStarted","Data":"2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649"} Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.581806 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.582682 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.586643 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.655503 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.658399 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" podStartSLOduration=4.658376479 podStartE2EDuration="4.658376479s" podCreationTimestamp="2026-03-19 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:52.623029905 +0000 UTC m=+217.161225066" watchObservedRunningTime="2026-03-19 00:10:52.658376479 +0000 UTC m=+217.196571610" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.725618 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.725973 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.726146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.760928 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.801240 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:52 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:52 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:52 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.801293 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:52 crc kubenswrapper[4745]: W0319 00:10:52.807036 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04cc89b5_7bac_4b91_bb97_a1f5ab14260c.slice/crio-9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac WatchSource:0}: Error finding container 9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac: Status 404 returned error can't find the container with id 9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828235 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828312 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828911 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828935 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.852207 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzp9\" (UniqueName: 
\"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.058783 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.077520 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36776: no serving certificate available for the kubelet" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.136938 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.137020 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.137153 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.137180 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.138130 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.141730 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.145035 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.145262 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 
00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.145692 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.161762 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.184971 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.354782 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.365942 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.379677 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.392873 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.610561 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:10:53 crc kubenswrapper[4745]: W0319 00:10:53.620744 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb19d4fad_672f_40f3_bfdb_53b36da06399.slice/crio-a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880 WatchSource:0}: Error finding container a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880: Status 404 returned error can't find the container with id a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880 Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.643437 4745 generic.go:334] "Generic (PLEG): container finished" podID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" exitCode=0 Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.643590 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa"} Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.643642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerStarted","Data":"9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac"} Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.652921 4745 generic.go:334] "Generic (PLEG): container finished" podID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1" exitCode=0 Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 
00:10:53.653068 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"} Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.673955 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"} Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.675085 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.675105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"2cd3908399145e2519a565664cfaff071ac8bf459660c66c4bd6a1d4b7d2532a"} Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.689619 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.747371 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.748376 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.750282 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.763773 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.772654 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.792024 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:53 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:53 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:53 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.792087 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.873468 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.873559 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.974780 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.974880 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.974994 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.012156 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.117969 4745 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.182748 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4r5k5"] Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.282058 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:54 crc kubenswrapper[4745]: W0319 00:10:54.361477 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be WatchSource:0}: Error finding container 5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be: Status 404 returned error can't find the container with id 5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.381567 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"6368460d-1bb9-4315-9730-1cf1673361fe\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.381648 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"6368460d-1bb9-4315-9730-1cf1673361fe\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.382084 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"6368460d-1bb9-4315-9730-1cf1673361fe" (UID: "6368460d-1bb9-4315-9730-1cf1673361fe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.406136 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6368460d-1bb9-4315-9730-1cf1673361fe" (UID: "6368460d-1bb9-4315-9730-1cf1673361fe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.484019 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.484074 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.773601 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.782934 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ace90e2b52fc2f5e41ef9ec72047f4210de51cf80ccfd73296dd54a80d3ad814"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.791772 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:54 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:54 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:54 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.791936 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.808513 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6368460d-1bb9-4315-9730-1cf1673361fe","Type":"ContainerDied","Data":"83e03c767a24ab1198cfb801009d43bc183be0d17f5e9bd4c865466aea4b9acd"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.808566 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e03c767a24ab1198cfb801009d43bc183be0d17f5e9bd4c865466aea4b9acd" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.808672 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.834325 4745 generic.go:334] "Generic (PLEG): container finished" podID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4" exitCode=0 Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.834473 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.859060 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" event={"ID":"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7","Type":"ContainerStarted","Data":"ed2fc26656f61bb585323ed6fc040e749d0c185124a74d28a5d612ff3a8ded9c"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.867628 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"af81edb17dda05bfeff969a5069a4a4f9214d26fc97643a708c50e839c9089fd"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.877754 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.878103 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 
00:10:55.154306 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 00:10:55 crc kubenswrapper[4745]: W0319 00:10:55.194569 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod10ad2429_ed3b_4688_8ee9_361a2ea56579.slice/crio-142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646 WatchSource:0}: Error finding container 142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646: Status 404 returned error can't find the container with id 142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646 Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.803096 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:55 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:55 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:55 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.803705 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.896201 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" event={"ID":"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7","Type":"ContainerStarted","Data":"5bba45c36d6c7e4eeb534a2f79a4a410cf1d990596123b87933c698abdf5ab44"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.905785 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ef991f8cac41319612cb2cf2e0b4460c10fc4ce79aa2c3c97624c2bf620b9b17"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.906991 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.909726 4745 generic.go:334] "Generic (PLEG): container finished" podID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" exitCode=0 Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.909799 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.916014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0d351bba75fabd782d641353de41feb49f9783105b3fb6ab72b2760acb6d10d1"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.917593 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10ad2429-ed3b-4688-8ee9-361a2ea56579","Type":"ContainerStarted","Data":"142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.921642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"71f012d5eab7f1600b917d7ba192a81649e4aaa214f494b6d6d830b83eec1ab3"} Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 
00:10:56.731644 4745 ???:1] "http: TLS handshake error from 192.168.126.11:47450: no serving certificate available for the kubelet" Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.796858 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.800909 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.941575 4745 generic.go:334] "Generic (PLEG): container finished" podID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerID="62c1b52c2f117491bd97c470bf6b3b0a2a679c1e20ac5a146452b907ba153571" exitCode=0 Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.941752 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10ad2429-ed3b-4688-8ee9-361a2ea56579","Type":"ContainerDied","Data":"62c1b52c2f117491bd97c470bf6b3b0a2a679c1e20ac5a146452b907ba153571"} Mar 19 00:10:57 crc kubenswrapper[4745]: I0319 00:10:57.325476 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.039549 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" event={"ID":"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7","Type":"ContainerStarted","Data":"d0e57813f5b03e4ca579c832c1ba05dc0c0e653d2d1c5d9ead50bb120b0518a1"} Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.057089 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4r5k5" podStartSLOduration=184.057065514 podStartE2EDuration="3m4.057065514s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:58.055029966 +0000 UTC m=+222.593225097" watchObservedRunningTime="2026-03-19 00:10:58.057065514 +0000 UTC m=+222.595260645" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.568985 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.592052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"10ad2429-ed3b-4688-8ee9-361a2ea56579\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.592208 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"10ad2429-ed3b-4688-8ee9-361a2ea56579\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.592588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "10ad2429-ed3b-4688-8ee9-361a2ea56579" (UID: "10ad2429-ed3b-4688-8ee9-361a2ea56579"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.602196 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "10ad2429-ed3b-4688-8ee9-361a2ea56579" (UID: "10ad2429-ed3b-4688-8ee9-361a2ea56579"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.697648 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.698240 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:59 crc kubenswrapper[4745]: I0319 00:10:59.082431 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:59 crc kubenswrapper[4745]: I0319 00:10:59.082423 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10ad2429-ed3b-4688-8ee9-361a2ea56579","Type":"ContainerDied","Data":"142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646"} Mar 19 00:10:59 crc kubenswrapper[4745]: I0319 00:10:59.082508 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646" Mar 19 00:11:01 crc kubenswrapper[4745]: I0319 00:11:01.374478 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:11:01 crc kubenswrapper[4745]: I0319 00:11:01.380336 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:11:01 crc kubenswrapper[4745]: I0319 00:11:01.691205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.941023 4745 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.941766 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" containerID="cri-o://b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d" gracePeriod=30 Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.960038 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.960249 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" containerID="cri-o://6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a" gracePeriod=30 Mar 19 00:11:08 crc kubenswrapper[4745]: I0319 00:11:08.165958 4745 generic.go:334] "Generic (PLEG): container finished" podID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerID="6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a" exitCode=0 Mar 19 00:11:08 crc kubenswrapper[4745]: I0319 00:11:08.166047 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerDied","Data":"6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a"} Mar 19 00:11:08 crc kubenswrapper[4745]: I0319 00:11:08.168414 4745 generic.go:334] "Generic (PLEG): container finished" podID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerID="b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d" exitCode=0 Mar 19 00:11:08 crc kubenswrapper[4745]: 
I0319 00:11:08.168445 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerDied","Data":"b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d"} Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.513262 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549123 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:09 crc kubenswrapper[4745]: E0319 00:11:09.549513 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6368460d-1bb9-4315-9730-1cf1673361fe" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549540 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6368460d-1bb9-4315-9730-1cf1673361fe" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: E0319 00:11:09.549552 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549566 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: E0319 00:11:09.549582 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549593 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549797 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549821 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549835 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6368460d-1bb9-4315-9730-1cf1673361fe" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.550598 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.553706 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570372 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570457 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570538 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570562 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570629 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.572018 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config" (OuterVolumeSpecName: "config") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.572536 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.572905 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.581589 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz" (OuterVolumeSpecName: "kube-api-access-nsqlz") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "kube-api-access-nsqlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.581606 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.671898 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672224 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672538 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672675 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672961 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.673019 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.673046 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 
00:11:09.673069 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.673092 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.773891 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.773947 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.774037 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.774066 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.774097 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.775616 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.775901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.776495 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.782740 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.789806 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.874819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.179132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerDied","Data":"2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649"} Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.179194 4745 scope.go:117] "RemoveContainer" containerID="b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.179216 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.195094 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.197736 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.616556 4745 patch_prober.go:28] interesting pod/route-controller-manager-5d896c4bf4-rssdk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.616624 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.869334 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:11:12 crc kubenswrapper[4745]: I0319 00:11:12.145655 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" path="/var/lib/kubelet/pods/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb/volumes" Mar 19 00:11:15 crc kubenswrapper[4745]: I0319 00:11:15.606052 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:11:15 crc kubenswrapper[4745]: I0319 00:11:15.606346 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:11:17 crc kubenswrapper[4745]: I0319 00:11:17.238426 4745 ???:1] "http: TLS handshake error from 192.168.126.11:47750: no serving certificate available for the kubelet" Mar 19 00:11:18 crc kubenswrapper[4745]: I0319 00:11:18.220350 4745 generic.go:334] "Generic (PLEG): container finished" podID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerID="f0c4e6e413094127a3df81fb923d127a9cc66b65e1e4d1cf289b3133f8a3d81a" exitCode=0 Mar 19 00:11:18 crc kubenswrapper[4745]: I0319 00:11:18.220426 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerDied","Data":"f0c4e6e413094127a3df81fb923d127a9cc66b65e1e4d1cf289b3133f8a3d81a"} Mar 19 00:11:21 crc kubenswrapper[4745]: I0319 00:11:21.617206 4745 patch_prober.go:28] interesting pod/route-controller-manager-5d896c4bf4-rssdk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:11:21 crc kubenswrapper[4745]: I0319 00:11:21.617288 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.233803 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.328655 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.330379 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.334595 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.335173 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.335300 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.457175 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.457538 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.559240 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.559541 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.559355 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.596504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: E0319 00:11:22.609741 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 00:11:22 crc kubenswrapper[4745]: E0319 00:11:22.609920 4745 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 00:11:22 crc kubenswrapper[4745]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 00:11:22 crc kubenswrapper[4745]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c5kr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564650-7k6ld_openshift-infra(d14d18f5-0177-4458-8ea3-b266cc96d658): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 00:11:22 crc kubenswrapper[4745]: > logger="UnhandledError" Mar 19 00:11:22 crc kubenswrapper[4745]: E0319 00:11:22.611069 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.651202 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.207285 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.216741 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.253069 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerDied","Data":"61b844b715e359e6d46bde055bd1277533b2b0f0ca8e92400f5004133e860078"} Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.253331 4745 scope.go:117] "RemoveContainer" containerID="6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.253416 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.260727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerDied","Data":"f164174bebded768a6d9e6df2a3ae9216824193f2bf4ab662809fd13bd0bfb0d"} Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.260750 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.260767 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f164174bebded768a6d9e6df2a3ae9216824193f2bf4ab662809fd13bd0bfb0d" Mar 19 00:11:23 crc kubenswrapper[4745]: E0319 00:11:23.261920 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267559 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"88004414-de81-4e3c-9f3f-99f90a3bbc98\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267633 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267737 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267756 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod 
\"88004414-de81-4e3c-9f3f-99f90a3bbc98\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267780 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.268442 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca" (OuterVolumeSpecName: "serviceca") pod "88004414-de81-4e3c-9f3f-99f90a3bbc98" (UID: "88004414-de81-4e3c-9f3f-99f90a3bbc98"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.268643 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config" (OuterVolumeSpecName: "config") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.269149 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca" (OuterVolumeSpecName: "client-ca") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.274017 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx" (OuterVolumeSpecName: "kube-api-access-lnckx") pod "88004414-de81-4e3c-9f3f-99f90a3bbc98" (UID: "88004414-de81-4e3c-9f3f-99f90a3bbc98"). InnerVolumeSpecName "kube-api-access-lnckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.274071 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.282781 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6" (OuterVolumeSpecName: "kube-api-access-g62l6") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "kube-api-access-g62l6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.369933 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.369994 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370009 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370019 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370029 4745 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370038 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.582293 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.585660 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:11:24 crc kubenswrapper[4745]: I0319 00:11:24.148387 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" path="/var/lib/kubelet/pods/4dac052a-7e93-4343-901a-6b0cfb885cc4/volumes" Mar 19 00:11:26 crc kubenswrapper[4745]: I0319 00:11:26.957541 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076241 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.076467 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076480 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.076493 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerName="image-pruner" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076500 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerName="image-pruner" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076618 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076631 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerName="image-pruner" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 
00:11:27.077045 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.080035 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.080873 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.081195 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.082723 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.084255 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.085840 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.129959 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 
00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130249 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231704 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231753 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231784 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.233383 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.233453 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 
00:11:27.241812 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.249036 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.393830 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.596461 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.596982 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rkp49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kgsn7_openshift-marketplace(09d29a41-94df-42b0-b7d3-6b47b06a238f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.598447 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kgsn7" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" Mar 19 00:11:27 crc 
kubenswrapper[4745]: I0319 00:11:27.919266 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.924329 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.925228 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.046303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.046380 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.046441 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147229 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147288 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147339 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147421 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147453 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.172210 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.248148 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.256649 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kgsn7" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.355695 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.356146 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rttcj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-75vmv_openshift-marketplace(71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.357672 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-75vmv" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" Mar 19 00:11:31 crc 
kubenswrapper[4745]: E0319 00:11:31.360015 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.360130 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdqmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-q9zn6_openshift-marketplace(c21b8175-025a-4d91-ad43-389dbad40846): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.361346 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-q9zn6" podUID="c21b8175-025a-4d91-ad43-389dbad40846" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.670413 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-75vmv" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.672634 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q9zn6" podUID="c21b8175-025a-4d91-ad43-389dbad40846" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.756075 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.756556 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqj2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mtjq5_openshift-marketplace(2c3c406d-9994-4629-b585-4d145b1e04aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.757859 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mtjq5" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.761086 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.761240 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sq28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wf9ss_openshift-marketplace(04cc89b5-7bac-4b91-bb97-a1f5ab14260c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.762814 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wf9ss" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.778076 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.778235 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdzp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cgghw_openshift-marketplace(b19d4fad-672f-40f3-bfdb-53b36da06399): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.779619 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" Mar 19 00:11:33 crc 
kubenswrapper[4745]: I0319 00:11:33.043561 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:33 crc kubenswrapper[4745]: I0319 00:11:33.361011 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.361854 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wf9ss" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.361961 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mtjq5" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.362020 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" Mar 19 00:11:34 crc kubenswrapper[4745]: W0319 00:11:34.394315 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98407a3b_8601_4632_b5b0_9308cfe2dbb6.slice/crio-9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3 WatchSource:0}: Error finding container 9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3: Status 404 returned error can't find the 
container with id 9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3 Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.468245 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.468522 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82wwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy
:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hhfzg_openshift-marketplace(0c1d22d3-b584-4622-856c-b531a5d1ad5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.469691 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hhfzg" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.499233 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.499365 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-695vg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-g5dw2_openshift-marketplace(ee0bf814-e571-41fe-9265-b77d8b53e20f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.500925 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-g5dw2" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" Mar 19 00:11:34 crc 
kubenswrapper[4745]: I0319 00:11:34.572551 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 00:11:34 crc kubenswrapper[4745]: I0319 00:11:34.828507 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 00:11:34 crc kubenswrapper[4745]: I0319 00:11:34.837304 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:34 crc kubenswrapper[4745]: W0319 00:11:34.837457 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb131d37_4be2_4843_9ed6_21fc0636b07f.slice/crio-0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7 WatchSource:0}: Error finding container 0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7: Status 404 returned error can't find the container with id 0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7 Mar 19 00:11:34 crc kubenswrapper[4745]: W0319 00:11:34.839993 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c10585a_574b_4a55_8b88_9997418b9e02.slice/crio-77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6 WatchSource:0}: Error finding container 77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6: Status 404 returned error can't find the container with id 77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6 Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.320443 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerStarted","Data":"77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.321606 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerStarted","Data":"0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324082 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerStarted","Data":"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324342 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" containerID="cri-o://b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" gracePeriod=30 Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324346 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerStarted","Data":"9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324553 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.328347 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerStarted","Data":"0f87fd30d76b53dcac185b3503fb308808afdfd323c801d8401fbd4b0ed01bc0"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.328387 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerStarted","Data":"d51d30df499606a711683bc76abba35055a73b43b636926a63aab2f8353386ae"} Mar 19 00:11:35 crc kubenswrapper[4745]: E0319 00:11:35.345286 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hhfzg" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" Mar 19 00:11:35 crc kubenswrapper[4745]: E0319 00:11:35.345464 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-g5dw2" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.348679 4745 patch_prober.go:28] interesting pod/controller-manager-66fbb79cf5-mhgjw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:51696->10.217.0.58:8443: read: connection reset by peer" start-of-body= Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.349504 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:51696->10.217.0.58:8443: read: connection reset by peer" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.358070 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" podStartSLOduration=29.358043924 podStartE2EDuration="29.358043924s" podCreationTimestamp="2026-03-19 00:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:35.351775206 +0000 UTC m=+259.889970347" watchObservedRunningTime="2026-03-19 00:11:35.358043924 +0000 UTC m=+259.896239055" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.413273 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.413254618 podStartE2EDuration="13.413254618s" podCreationTimestamp="2026-03-19 00:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:35.393078888 +0000 UTC m=+259.931274029" watchObservedRunningTime="2026-03-19 00:11:35.413254618 +0000 UTC m=+259.951449749" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.694931 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713105 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713154 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713180 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713213 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.714018 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.714085 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config" (OuterVolumeSpecName: "config") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.714562 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.723560 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc" (OuterVolumeSpecName: "kube-api-access-xp7vc") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "kube-api-access-xp7vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.727108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.730512 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:35 crc kubenswrapper[4745]: E0319 00:11:35.730790 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.730807 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.730947 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.731314 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.746749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814393 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814443 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"controller-manager-6c4857fb46-nxt5t\" 
(UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814463 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814667 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814733 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814910 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814925 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc 
kubenswrapper[4745]: I0319 00:11:35.814936 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814946 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814956 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916332 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916358 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " 
pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916417 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916444 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.917716 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.917951 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.918265 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod 
\"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.921267 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.934403 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.086081 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.267737 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.336601 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerStarted","Data":"58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.336981 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.337710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerStarted","Data":"4df8c028721f86f64f802110e34a6846149248c20cfe8d5077d6d03475ad3327"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.341744 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerStarted","Data":"e4a052c3c9127a082f3e369e3635dfa6ffb8b29174c2b0096d5b731709aa71d5"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343792 4745 generic.go:334] "Generic (PLEG): container finished" podID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" exitCode=0 Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343845 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343899 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerDied","Data":"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343957 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerDied","Data":"9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343977 4745 scope.go:117] "RemoveContainer" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.344816 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.347119 4745 generic.go:334] "Generic (PLEG): container finished" podID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerID="0f87fd30d76b53dcac185b3503fb308808afdfd323c801d8401fbd4b0ed01bc0" exitCode=0 Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.347163 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerDied","Data":"0f87fd30d76b53dcac185b3503fb308808afdfd323c801d8401fbd4b0ed01bc0"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.358380 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" podStartSLOduration=9.358361425 
podStartE2EDuration="9.358361425s" podCreationTimestamp="2026-03-19 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:36.354366501 +0000 UTC m=+260.892561632" watchObservedRunningTime="2026-03-19 00:11:36.358361425 +0000 UTC m=+260.896556566" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.363805 4745 scope.go:117] "RemoveContainer" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" Mar 19 00:11:36 crc kubenswrapper[4745]: E0319 00:11:36.364166 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612\": container with ID starting with b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612 not found: ID does not exist" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.364211 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612"} err="failed to get container status \"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612\": rpc error: code = NotFound desc = could not find container \"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612\": container with ID starting with b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612 not found: ID does not exist" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.408778 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.408750848 podStartE2EDuration="9.408750848s" podCreationTimestamp="2026-03-19 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 00:11:36.406827735 +0000 UTC m=+260.945022866" watchObservedRunningTime="2026-03-19 00:11:36.408750848 +0000 UTC m=+260.946945979" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.441128 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.447073 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.362137 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerStarted","Data":"cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636"} Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.662533 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.679264 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" podStartSLOduration=11.679238504 podStartE2EDuration="11.679238504s" podCreationTimestamp="2026-03-19 00:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:37.388036706 +0000 UTC m=+261.926231847" watchObservedRunningTime="2026-03-19 00:11:37.679238504 +0000 UTC m=+262.217433635" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749136 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"5698031f-9dc1-4457-a866-2fd312ebfa9e\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749242 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"5698031f-9dc1-4457-a866-2fd312ebfa9e\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749295 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5698031f-9dc1-4457-a866-2fd312ebfa9e" (UID: "5698031f-9dc1-4457-a866-2fd312ebfa9e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749608 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.756550 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5698031f-9dc1-4457-a866-2fd312ebfa9e" (UID: "5698031f-9dc1-4457-a866-2fd312ebfa9e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.851174 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.145622 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" path="/var/lib/kubelet/pods/98407a3b-8601-4632-b5b0-9308cfe2dbb6/volumes" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.368206 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerDied","Data":"d51d30df499606a711683bc76abba35055a73b43b636926a63aab2f8353386ae"} Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.368248 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d51d30df499606a711683bc76abba35055a73b43b636926a63aab2f8353386ae" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.368480 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.369311 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.374418 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.385425 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerStarted","Data":"76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0"} Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.404064 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" podStartSLOduration=45.706455093 podStartE2EDuration="1m41.404038501s" podCreationTimestamp="2026-03-19 00:10:00 +0000 UTC" firstStartedPulling="2026-03-19 00:10:45.253765238 +0000 UTC m=+209.791960369" lastFinishedPulling="2026-03-19 00:11:40.951348646 +0000 UTC m=+265.489543777" observedRunningTime="2026-03-19 00:11:41.399268778 +0000 UTC m=+265.937463929" watchObservedRunningTime="2026-03-19 00:11:41.404038501 +0000 UTC m=+265.942233632" Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.589405 4745 csr.go:261] certificate signing request csr-r6h9t is approved, waiting to be issued Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.595812 4745 csr.go:257] certificate signing request csr-r6h9t is issued Mar 19 00:11:42 crc kubenswrapper[4745]: I0319 00:11:42.392768 4745 generic.go:334] "Generic (PLEG): container finished" podID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerID="76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0" exitCode=0 Mar 19 00:11:42 crc 
kubenswrapper[4745]: I0319 00:11:42.392818 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerDied","Data":"76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0"} Mar 19 00:11:42 crc kubenswrapper[4745]: I0319 00:11:42.597525 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-06 08:37:05.371483213 +0000 UTC Mar 19 00:11:42 crc kubenswrapper[4745]: I0319 00:11:42.597903 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7040h25m22.773583555s for next certificate rotation Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.598975 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 09:58:54.517442097 +0000 UTC Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.599014 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6201h47m10.918430965s for next certificate rotation Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.693394 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.751566 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"d14d18f5-0177-4458-8ea3-b266cc96d658\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.757027 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4" (OuterVolumeSpecName: "kube-api-access-c5kr4") pod "d14d18f5-0177-4458-8ea3-b266cc96d658" (UID: "d14d18f5-0177-4458-8ea3-b266cc96d658"). InnerVolumeSpecName "kube-api-access-c5kr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.852634 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:44 crc kubenswrapper[4745]: I0319 00:11:44.402423 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerDied","Data":"158cd16e86d97bd741b6d9d3a091e473262326dad949142d4f04bb6f64676d4b"} Mar 19 00:11:44 crc kubenswrapper[4745]: I0319 00:11:44.402728 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="158cd16e86d97bd741b6d9d3a091e473262326dad949142d4f04bb6f64676d4b" Mar 19 00:11:44 crc kubenswrapper[4745]: I0319 00:11:44.402785 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.410444 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerStarted","Data":"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"} Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.606399 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.606753 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.606799 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.607413 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.607491 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574" gracePeriod=600 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.416803 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574" exitCode=0 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.416921 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"} Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.418990 4745 generic.go:334] "Generic (PLEG): container finished" podID="c21b8175-025a-4d91-ad43-389dbad40846" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9" exitCode=0 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.419018 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"} Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.956693 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.956988 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager" containerID="cri-o://cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636" gracePeriod=30 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.971699 
4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.972571 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" containerID="cri-o://58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281" gracePeriod=30 Mar 19 00:11:47 crc kubenswrapper[4745]: I0319 00:11:47.394631 4745 patch_prober.go:28] interesting pod/route-controller-manager-65b7fbb54-twj8q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 19 00:11:47 crc kubenswrapper[4745]: I0319 00:11:47.394736 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 00:11:48.434512 4745 generic.go:334] "Generic (PLEG): container finished" podID="9c10585a-574b-4a55-8b88-9997418b9e02" containerID="58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281" exitCode=0 Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 00:11:48.434593 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerDied","Data":"58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281"} Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 
00:11:48.436049 4745 generic.go:334] "Generic (PLEG): container finished" podID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerID="cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636" exitCode=0 Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 00:11:48.436073 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerDied","Data":"cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636"} Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.794508 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.829688 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:11:51 crc kubenswrapper[4745]: E0319 00:11:51.830130 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerName="pruner" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830150 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerName="pruner" Mar 19 00:11:51 crc kubenswrapper[4745]: E0319 00:11:51.830167 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerName="oc" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830174 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerName="oc" Mar 19 00:11:51 crc kubenswrapper[4745]: E0319 00:11:51.830185 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830192 
4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830320 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerName="oc" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830334 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerName="pruner" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830346 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830807 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.834379 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874458 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874561 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874701 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.876910 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877045 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877246 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877279 4745 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config" (OuterVolumeSpecName: "config") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877448 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877617 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877854 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.887144 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.887164 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm" (OuterVolumeSpecName: "kube-api-access-h7qbm") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "kube-api-access-h7qbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.914759 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978423 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978507 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978530 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978588 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978623 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978757 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978779 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978827 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978861 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxr8g\" (UniqueName: 
\"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978944 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978959 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978970 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.980119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.980464 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.981946 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config" (OuterVolumeSpecName: "config") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.982331 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.982661 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.984080 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.984611 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5" (OuterVolumeSpecName: "kube-api-access-hvgx5") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "kube-api-access-hvgx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.987819 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.994046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080862 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080917 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080929 4745 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080938 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080948 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.206818 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.460484 4745 generic.go:334] "Generic (PLEG): container finished" podID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4" exitCode=0 Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.460565 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.462596 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.462582 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerDied","Data":"77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.462751 4745 scope.go:117] "RemoveContainer" containerID="58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.466131 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.466129 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerDied","Data":"4df8c028721f86f64f802110e34a6846149248c20cfe8d5077d6d03475ad3327"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.468441 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.483636 4745 scope.go:117] "RemoveContainer" containerID="cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.491259 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.494825 4745 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.516470 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.519642 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.605735 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:11:52 crc kubenswrapper[4745]: W0319 00:11:52.612734 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb08b6e2_c86d_4188_94dd_5605fe96f0dc.slice/crio-7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706 WatchSource:0}: Error finding container 7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706: Status 404 returned error can't find the container with id 7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706 Mar 19 00:11:53 crc kubenswrapper[4745]: I0319 00:11:53.480004 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerStarted","Data":"f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a"} Mar 19 00:11:53 crc kubenswrapper[4745]: I0319 00:11:53.480500 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerStarted","Data":"7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706"} Mar 19 00:11:53 crc kubenswrapper[4745]: I0319 00:11:53.515756 
4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" podStartSLOduration=6.515721802 podStartE2EDuration="6.515721802s" podCreationTimestamp="2026-03-19 00:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:53.502910244 +0000 UTC m=+278.041105405" watchObservedRunningTime="2026-03-19 00:11:53.515721802 +0000 UTC m=+278.053916933" Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.160354 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" path="/var/lib/kubelet/pods/14ce067d-78a6-4ed3-9295-fb73f2b931fb/volumes" Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.162373 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" path="/var/lib/kubelet/pods/9c10585a-574b-4a55-8b88-9997418b9e02/volumes" Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.489503 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.491508 4745 generic.go:334] "Generic (PLEG): container finished" podID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" exitCode=0 Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.491588 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 
00:11:54.494714 4745 generic.go:334] "Generic (PLEG): container finished" podID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c" exitCode=0 Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.494759 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.502029 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerStarted","Data":"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.504332 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerStarted","Data":"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.507308 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.509968 4745 generic.go:334] "Generic (PLEG): container finished" podID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerID="0f78ef6f7929f92b1cadbd575e901ee75c29f7405f9ec002e10bc0dc0774c85c" exitCode=0 Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.510040 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" 
event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"0f78ef6f7929f92b1cadbd575e901ee75c29f7405f9ec002e10bc0dc0774c85c"}
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.512240 4745 generic.go:334] "Generic (PLEG): container finished" podID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" exitCode=0
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.512296 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5"}
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.512655 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.520823 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.534041 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"]
Mar 19 00:11:54 crc kubenswrapper[4745]: E0319 00:11:54.534340 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.534362 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.534504 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.535127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.541454 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.542135 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.543165 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.596741 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.601551 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.601758 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.608153 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.614068 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"]
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.617527 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mtjq5" podStartSLOduration=3.56730852 podStartE2EDuration="1m3.617508361s" podCreationTimestamp="2026-03-19 00:10:51 +0000 UTC" firstStartedPulling="2026-03-19 00:10:53.667136009 +0000 UTC m=+218.205331140" lastFinishedPulling="2026-03-19 00:11:53.71733585 +0000 UTC m=+278.255530981" observedRunningTime="2026-03-19 00:11:54.561393249 +0000 UTC m=+279.099588380" watchObservedRunningTime="2026-03-19 00:11:54.617508361 +0000 UTC m=+279.155703492"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619315 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619394 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619456 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619574 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.631031 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q9zn6" podStartSLOduration=4.328831615 podStartE2EDuration="1m6.631010692s" podCreationTimestamp="2026-03-19 00:10:48 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.116172097 +0000 UTC m=+215.654367228" lastFinishedPulling="2026-03-19 00:11:53.418351174 +0000 UTC m=+277.956546305" observedRunningTime="2026-03-19 00:11:54.624479163 +0000 UTC m=+279.162674294" watchObservedRunningTime="2026-03-19 00:11:54.631010692 +0000 UTC m=+279.169205823"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721138 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721225 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721285 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721324 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721359 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.723262 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.723257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.731728 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.754512 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.839850 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.918663 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.197307 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"]
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.532385 4745 generic.go:334] "Generic (PLEG): container finished" podID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9" exitCode=0
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.532904 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.540527 4745 generic.go:334] "Generic (PLEG): container finished" podID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" exitCode=0
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.540637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.554222 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerStarted","Data":"db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.554308 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerStarted","Data":"27758a5fbf789728ef48779744e310672915539b075e2b4fbe9182cb884b96c3"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.557444 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.560400 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.594177 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" podStartSLOduration=9.594155933 podStartE2EDuration="9.594155933s" podCreationTimestamp="2026-03-19 00:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:55.591300872 +0000 UTC m=+280.129496013" watchObservedRunningTime="2026-03-19 00:11:55.594155933 +0000 UTC m=+280.132351064"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:56.560275 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerStarted","Data":"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"}
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:56.579457 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g5dw2" podStartSLOduration=2.876740324 podStartE2EDuration="1m7.579437712s" podCreationTimestamp="2026-03-19 00:10:49 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.187609261 +0000 UTC m=+215.725804392" lastFinishedPulling="2026-03-19 00:11:55.890306649 +0000 UTC m=+280.428501780" observedRunningTime="2026-03-19 00:11:56.577989556 +0000 UTC m=+281.116184687" watchObservedRunningTime="2026-03-19 00:11:56.579437712 +0000 UTC m=+281.117632843"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:57.567869 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerStarted","Data":"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9"}
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:57.586759 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hhfzg" podStartSLOduration=3.218344471 podStartE2EDuration="1m8.586741223s" podCreationTimestamp="2026-03-19 00:10:49 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.148460931 +0000 UTC m=+215.686656052" lastFinishedPulling="2026-03-19 00:11:56.516857673 +0000 UTC m=+281.055052804" observedRunningTime="2026-03-19 00:11:57.58666226 +0000 UTC m=+282.124857401" watchObservedRunningTime="2026-03-19 00:11:57.586741223 +0000 UTC m=+282.124936354"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.367424 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.367792 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.559557 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hhfzg"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.559859 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hhfzg"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.579999 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerStarted","Data":"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba"}
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.928001 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.928375 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.133459 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"]
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.134142 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.135646 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.136436 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.140631 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.151050 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"]
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.287863 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"auto-csr-approver-29564652-nhhsh\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") " pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.339488 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hhfzg"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.340492 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.342477 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.393059 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"auto-csr-approver-29564652-nhhsh\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") " pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.416144 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"auto-csr-approver-29564652-nhhsh\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") " pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.420860 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.451698 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.604670 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wf9ss" podStartSLOduration=4.899620489 podStartE2EDuration="1m9.604645129s" podCreationTimestamp="2026-03-19 00:10:51 +0000 UTC" firstStartedPulling="2026-03-19 00:10:53.667420379 +0000 UTC m=+218.205615510" lastFinishedPulling="2026-03-19 00:11:58.372445019 +0000 UTC m=+282.910640150" observedRunningTime="2026-03-19 00:12:00.600781776 +0000 UTC m=+285.138976907" watchObservedRunningTime="2026-03-19 00:12:00.604645129 +0000 UTC m=+285.142840260"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.181334 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"]
Mar 19 00:12:01 crc kubenswrapper[4745]: W0319 00:12:01.184240 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef55829_c74d_4c78_b9b9_1c3ea05456e9.slice/crio-ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13 WatchSource:0}: Error finding container ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13: Status 404 returned error can't find the container with id ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.533746 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.533802 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.587480 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.595073 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"}
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.596872 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerStarted","Data":"ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13"}
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.645073 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.646016 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.933998 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.934394 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.972053 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.170872 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"]
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.605115 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerStarted","Data":"117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6"}
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.607520 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c"}
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.624449 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgsn7" podStartSLOduration=2.467508418 podStartE2EDuration="1m13.624415847s" podCreationTimestamp="2026-03-19 00:10:49 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.134133764 +0000 UTC m=+215.672328895" lastFinishedPulling="2026-03-19 00:12:02.291041183 +0000 UTC m=+286.829236324" observedRunningTime="2026-03-19 00:12:02.623480157 +0000 UTC m=+287.161675308" watchObservedRunningTime="2026-03-19 00:12:02.624415847 +0000 UTC m=+287.162611008"
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.644313 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cgghw" podStartSLOduration=3.283028802 podStartE2EDuration="1m10.644289402s" podCreationTimestamp="2026-03-19 00:10:52 +0000 UTC" firstStartedPulling="2026-03-19 00:10:54.881193501 +0000 UTC m=+219.419388632" lastFinishedPulling="2026-03-19 00:12:02.242454101 +0000 UTC m=+286.780649232" observedRunningTime="2026-03-19 00:12:02.642139542 +0000 UTC m=+287.180334683" watchObservedRunningTime="2026-03-19 00:12:02.644289402 +0000 UTC m=+287.182484523"
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.668022 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-75vmv" podStartSLOduration=3.569932053 podStartE2EDuration="1m10.667999678s" podCreationTimestamp="2026-03-19 00:10:52 +0000 UTC" firstStartedPulling="2026-03-19 00:10:53.683011427 +0000 UTC m=+218.221206558" lastFinishedPulling="2026-03-19 00:12:00.781079052 +0000 UTC m=+285.319274183" observedRunningTime="2026-03-19 00:12:02.665735566 +0000 UTC m=+287.203930697" watchObservedRunningTime="2026-03-19 00:12:02.667999678 +0000 UTC m=+287.206194809"
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.059781 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.059837 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.613093 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerStarted","Data":"5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284"}
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.613758 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g5dw2" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" containerID="cri-o://2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b" gracePeriod=2
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.627730 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" podStartSLOduration=1.5222021749999999 podStartE2EDuration="3.62771247s" podCreationTimestamp="2026-03-19 00:12:00 +0000 UTC" firstStartedPulling="2026-03-19 00:12:01.187685735 +0000 UTC m=+285.725880866" lastFinishedPulling="2026-03-19 00:12:03.29319603 +0000 UTC m=+287.831391161" observedRunningTime="2026-03-19 00:12:03.62521038 +0000 UTC m=+288.163405501" watchObservedRunningTime="2026-03-19 00:12:03.62771247 +0000 UTC m=+288.165907601"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.078345 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.099155 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" probeResult="failure" output=<
Mar 19 00:12:04 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s
Mar 19 00:12:04 crc kubenswrapper[4745]: >
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.147328 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"ee0bf814-e571-41fe-9265-b77d8b53e20f\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") "
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.147515 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"ee0bf814-e571-41fe-9265-b77d8b53e20f\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") "
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.147555 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"ee0bf814-e571-41fe-9265-b77d8b53e20f\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") "
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.148405 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities" (OuterVolumeSpecName: "utilities") pod "ee0bf814-e571-41fe-9265-b77d8b53e20f" (UID: "ee0bf814-e571-41fe-9265-b77d8b53e20f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.155591 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg" (OuterVolumeSpecName: "kube-api-access-695vg") pod "ee0bf814-e571-41fe-9265-b77d8b53e20f" (UID: "ee0bf814-e571-41fe-9265-b77d8b53e20f"). InnerVolumeSpecName "kube-api-access-695vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.210654 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee0bf814-e571-41fe-9265-b77d8b53e20f" (UID: "ee0bf814-e571-41fe-9265-b77d8b53e20f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.249231 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.249291 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") on node \"crc\" DevicePath \"\""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.249307 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.622251 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerID="5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284" exitCode=0
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.622335 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerDied","Data":"5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284"}
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625149 4745 generic.go:334] "Generic (PLEG): container finished" podID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b" exitCode=0
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625213 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625228 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"}
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625278 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"5cf5d6c8b2c76c4c80fde1c17a6532692631c387bb1a224bdbe1f73591bc68b3"}
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625300 4745 scope.go:117] "RemoveContainer" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.646993 4745 scope.go:117] "RemoveContainer" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.655052 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"]
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.669553 4745 scope.go:117] "RemoveContainer" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.672584 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"]
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.690080 4745 scope.go:117] "RemoveContainer" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"
Mar 19 00:12:04 crc kubenswrapper[4745]: E0319 00:12:04.690553 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b\": container with ID starting with 2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b not found: ID does not exist" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.690609 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"} err="failed to get container status \"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b\": rpc error: code = NotFound desc = could not find container \"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b\": container with ID starting with 2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b not found: ID does not exist"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.690641 4745 scope.go:117] "RemoveContainer" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"
Mar 19 00:12:04 crc kubenswrapper[4745]: E0319 00:12:04.691188 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c\": container with ID starting with 0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c not found: ID does not exist" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.691222 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"} err="failed to get container status \"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c\": rpc error: code = NotFound desc = could not find container \"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c\": container with ID starting with 0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c not found: ID does not exist"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.691268 4745 scope.go:117] "RemoveContainer" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"
Mar 19 00:12:04 crc kubenswrapper[4745]: E0319 00:12:04.691595 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925\": container with ID starting with 737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925 not found: ID does not exist" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.691613 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"} err="failed to get container status \"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925\": rpc error: code = NotFound desc = could not find container \"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925\": container with ID starting with 737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925 not found: ID does not exist"
Mar 19 00:12:05 crc kubenswrapper[4745]: I0319 00:12:05.981404 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.146262 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" path="/var/lib/kubelet/pods/ee0bf814-e571-41fe-9265-b77d8b53e20f/volumes"
Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.172655 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") "
Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.177103 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9" (OuterVolumeSpecName: "kube-api-access-znwm9") pod "9ef55829-c74d-4c78-b9b9-1c3ea05456e9" (UID: "9ef55829-c74d-4c78-b9b9-1c3ea05456e9"). InnerVolumeSpecName "kube-api-access-znwm9".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.274376 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.639252 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerDied","Data":"ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13"} Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.639606 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.639302 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.951565 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"] Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.951761 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" containerID="cri-o://db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6" gracePeriod=30 Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.044427 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.044626 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" containerID="cri-o://f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a" gracePeriod=30 Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.647937 4745 generic.go:334] "Generic (PLEG): container finished" podID="590c9ede-2a42-4251-9e05-321d560b674d" containerID="db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6" exitCode=0 Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.647988 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerDied","Data":"db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6"} Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.655570 4745 generic.go:334] "Generic (PLEG): container finished" podID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerID="f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a" exitCode=0 Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.655706 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerDied","Data":"f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a"} Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.703470 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.708546 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.729866 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f964689fb-zqm4l"] Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730127 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730148 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730167 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730174 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730186 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730195 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730205 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-content" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730213 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-content" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730225 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-utilities" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730232 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-utilities" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730242 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerName="oc" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730249 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerName="oc" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730726 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerName="oc" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730751 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730763 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730798 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.731647 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.744350 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f964689fb-zqm4l"] Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803614 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803652 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803705 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803744 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803783 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" 
(UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803804 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803818 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803848 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803905 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804749 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804765 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca" (OuterVolumeSpecName: "client-ca") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804778 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config" (OuterVolumeSpecName: "config") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804838 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config" (OuterVolumeSpecName: "config") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.809414 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g" (OuterVolumeSpecName: "kube-api-access-mxr8g") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "kube-api-access-mxr8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.809419 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q" (OuterVolumeSpecName: "kube-api-access-f5r2q") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "kube-api-access-f5r2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.809917 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.810057 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904722 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-proxy-ca-bundles\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904803 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-config\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904824 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b68939-d8b8-40ed-b961-67c46d82099e-serving-cert\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904845 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6pr\" (UniqueName: \"kubernetes.io/projected/57b68939-d8b8-40ed-b961-67c46d82099e-kube-api-access-rn6pr\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904862 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-client-ca\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904953 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904972 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904980 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904990 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904999 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905008 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905018 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5r2q\" (UniqueName: 
\"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905027 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905035 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-proxy-ca-bundles\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006476 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-config\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b68939-d8b8-40ed-b961-67c46d82099e-serving-cert\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006515 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rn6pr\" (UniqueName: \"kubernetes.io/projected/57b68939-d8b8-40ed-b961-67c46d82099e-kube-api-access-rn6pr\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006533 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-client-ca\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.007438 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-client-ca\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.007770 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-proxy-ca-bundles\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.009020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-config\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc 
kubenswrapper[4745]: I0319 00:12:09.009776 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b68939-d8b8-40ed-b961-67c46d82099e-serving-cert\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.026469 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6pr\" (UniqueName: \"kubernetes.io/projected/57b68939-d8b8-40ed-b961-67c46d82099e-kube-api-access-rn6pr\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.057009 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.253404 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f964689fb-zqm4l"] Mar 19 00:12:09 crc kubenswrapper[4745]: W0319 00:12:09.260867 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b68939_d8b8_40ed_b961_67c46d82099e.slice/crio-a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83 WatchSource:0}: Error finding container a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83: Status 404 returned error can't find the container with id a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83 Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.598413 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 
00:12:09.661968 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" event={"ID":"57b68939-d8b8-40ed-b961-67c46d82099e","Type":"ContainerStarted","Data":"a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83"} Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.663590 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerDied","Data":"27758a5fbf789728ef48779744e310672915539b075e2b4fbe9182cb884b96c3"} Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.663617 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.663648 4745 scope.go:117] "RemoveContainer" containerID="db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.664829 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerDied","Data":"7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706"} Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.664901 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.694671 4745 scope.go:117] "RemoveContainer" containerID="f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.701580 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.704653 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.712335 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.716478 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.737513 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.737833 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.773405 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.145535 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590c9ede-2a42-4251-9e05-321d560b674d" path="/var/lib/kubelet/pods/590c9ede-2a42-4251-9e05-321d560b674d/volumes" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.160234 4745 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" path="/var/lib/kubelet/pods/fb08b6e2-c86d-4188-94dd-5605fe96f0dc/volumes" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.480395 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"] Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.672001 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" event={"ID":"57b68939-d8b8-40ed-b961-67c46d82099e","Type":"ContainerStarted","Data":"83aa938fad8f85173c008b4f63aed73a3eba5a81d0bd0763a971110d2227f0a7"} Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.672201 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.679301 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.695544 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" podStartSLOduration=4.695520233 podStartE2EDuration="4.695520233s" podCreationTimestamp="2026-03-19 00:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:10.691611408 +0000 UTC m=+295.229806559" watchObservedRunningTime="2026-03-19 00:12:10.695520233 +0000 UTC m=+295.233715374" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.722078 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.543128 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"] Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.544454 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.546855 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.546927 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.547040 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.547150 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.547150 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.549068 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.554801 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"] Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638092 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnth\" (UniqueName: \"kubernetes.io/projected/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-kube-api-access-9rnth\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: 
\"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-client-ca\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638222 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-serving-cert\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-config\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739516 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-config\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739638 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9rnth\" (UniqueName: \"kubernetes.io/projected/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-kube-api-access-9rnth\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739666 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-client-ca\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-serving-cert\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.741074 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-client-ca\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.741482 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-config\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " 
pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.748540 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-serving-cert\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.755404 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnth\" (UniqueName: \"kubernetes.io/projected/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-kube-api-access-9rnth\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.864589 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.042274 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.380259 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"] Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.585008 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.585374 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.620591 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.687000 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerStarted","Data":"6d53d3280e85a953679ea0148898d1c1ea4fc002985162e0b274152b731b26d9"} Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.722361 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.772552 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.897832 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:12:12 crc kubenswrapper[4745]: 
I0319 00:12:12.898519 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.898745 4745 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899182 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899227 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899238 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899261 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899266 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900044 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900264 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900276 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900283 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900289 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900298 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900306 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900315 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900320 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900327 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900333 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900342 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900347 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900359 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900365 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900378 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900385 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900478 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900487 4745 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900496 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900503 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900511 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900520 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900530 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900536 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900646 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900654 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900777 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900861 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900868 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.941421 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.058717 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.058863 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.058982 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059033 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059149 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059204 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: 
I0319 00:12:13.099110 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.099831 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.100172 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.100571 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.143702 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.144199 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.144553 4745 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.145014 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160776 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160818 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160842 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160858 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160888 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160894 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160914 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160964 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161013 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161034 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161053 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161077 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160948 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: E0319 00:12:13.161626 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-687d544d6d-7tl6q.189e15ac568c56f4 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-687d544d6d-7tl6q,UID:7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d,APIVersion:v1,ResourceVersion:30018,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,LastTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC 
m=+297.699345327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.240746 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.457582 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.457860 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.692172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f29949657637ad0e4307613a120b4e3c081a14fbf9cd618e3408f9c26c82c283"} Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.693696 4745 generic.go:334] "Generic (PLEG): container finished" podID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerID="e4a052c3c9127a082f3e369e3635dfa6ffb8b29174c2b0096d5b731709aa71d5" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.693739 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerDied","Data":"e4a052c3c9127a082f3e369e3635dfa6ffb8b29174c2b0096d5b731709aa71d5"} Mar 19 
00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694223 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694405 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694556 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694830 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.696233 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.697508 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698412 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698433 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698442 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698450 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77" exitCode=2 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698527 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702385 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgsn7" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" containerID="cri-o://117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6" gracePeriod=2 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" 
event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerStarted","Data":"ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177"} Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702764 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702792 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703168 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703485 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703667 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: 
connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703845 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.705648 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.706480 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.706769 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.707007 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.707183 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.707388 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.703437 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.703859 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.711051 4745 generic.go:334] "Generic (PLEG): container finished" podID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerID="117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6" 
exitCode=0 Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.711118 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6"} Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.712466 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e"} Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.713308 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.713643 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.713979 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.714278 4745 
status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.714729 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.715605 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.787368 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.788009 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.788577 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.789202 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.789730 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.790014 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.885106 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"09d29a41-94df-42b0-b7d3-6b47b06a238f\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.885178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod \"09d29a41-94df-42b0-b7d3-6b47b06a238f\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.885224 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"09d29a41-94df-42b0-b7d3-6b47b06a238f\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.886222 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities" (OuterVolumeSpecName: "utilities") pod "09d29a41-94df-42b0-b7d3-6b47b06a238f" (UID: "09d29a41-94df-42b0-b7d3-6b47b06a238f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.891337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49" (OuterVolumeSpecName: "kube-api-access-rkp49") pod "09d29a41-94df-42b0-b7d3-6b47b06a238f" (UID: "09d29a41-94df-42b0-b7d3-6b47b06a238f"). InnerVolumeSpecName "kube-api-access-rkp49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.936366 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09d29a41-94df-42b0-b7d3-6b47b06a238f" (UID: "09d29a41-94df-42b0-b7d3-6b47b06a238f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.987108 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.987429 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.987444 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.089346 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.089786 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090181 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090553 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090755 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090988 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188728 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"cb131d37-4be2-4843-9ed6-21fc0636b07f\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"cb131d37-4be2-4843-9ed6-21fc0636b07f\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188871 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"cb131d37-4be2-4843-9ed6-21fc0636b07f\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188975 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cb131d37-4be2-4843-9ed6-21fc0636b07f" (UID: "cb131d37-4be2-4843-9ed6-21fc0636b07f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.189051 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock" (OuterVolumeSpecName: "var-lock") pod "cb131d37-4be2-4843-9ed6-21fc0636b07f" (UID: "cb131d37-4be2-4843-9ed6-21fc0636b07f"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.189148 4745 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.189167 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.192534 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cb131d37-4be2-4843-9ed6-21fc0636b07f" (UID: "cb131d37-4be2-4843-9ed6-21fc0636b07f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.290545 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.717168 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.717562 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.724973 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.725759 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6" exitCode=0 Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.728397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb"} Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.728504 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.728504 4745 scope.go:117] "RemoveContainer" containerID="117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.729331 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.729608 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.729846 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730195 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730570 4745 status_manager.go:851] 
"Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerDied","Data":"0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7"} Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730803 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730807 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.735617 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.736607 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737214 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737447 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737677 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737974 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.738375 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.738802 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.743820 4745 scope.go:117] "RemoveContainer" containerID="0f78ef6f7929f92b1cadbd575e901ee75c29f7405f9ec002e10bc0dc0774c85c" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.745818 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.746317 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.746697 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747093 4745 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747376 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747609 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747966 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.748243 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc 
kubenswrapper[4745]: I0319 00:12:15.748521 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.748814 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.749215 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.749472 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.759052 4745 scope.go:117] "RemoveContainer" containerID="3c0dd7e0c251e39bd78fdfc535f458fe29dccacfeda18a0fdd0fe102becb3d5f" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897101 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897153 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897209 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897533 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897571 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.998607 4745 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.998636 4745 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.998644 4745 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141227 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141485 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141637 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141778 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141965 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.142108 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.147142 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.742312 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.743161 4745 scope.go:117] "RemoveContainer" 
containerID="9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.743341 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.744290 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.744608 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.745189 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.745943 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.746209 4745 
status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.746577 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.747057 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.747495 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.747813 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 
crc kubenswrapper[4745]: I0319 00:12:16.748274 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.748560 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.748820 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.769223 4745 scope.go:117] "RemoveContainer" containerID="6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.786783 4745 scope.go:117] "RemoveContainer" containerID="dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.800218 4745 scope.go:117] "RemoveContainer" containerID="1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.815784 4745 scope.go:117] "RemoveContainer" containerID="b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.837688 4745 scope.go:117] "RemoveContainer" 
containerID="a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1" Mar 19 00:12:17 crc kubenswrapper[4745]: E0319 00:12:17.159428 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-687d544d6d-7tl6q.189e15ac568c56f4 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-687d544d6d-7tl6q,UID:7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d,APIVersion:v1,ResourceVersion:30018,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,LastTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.488936 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.489725 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.490294 4745 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.490688 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.490993 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: I0319 00:12:19.491026 4745 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.491379 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.692874 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 19 00:12:20 crc kubenswrapper[4745]: E0319 00:12:20.093754 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 19 00:12:20 crc kubenswrapper[4745]: 
E0319 00:12:20.894931 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 19 00:12:22 crc kubenswrapper[4745]: E0319 00:12:22.175187 4745 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" volumeName="registry-storage" Mar 19 00:12:22 crc kubenswrapper[4745]: E0319 00:12:22.496279 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 19 00:12:22 crc kubenswrapper[4745]: I0319 00:12:22.865948 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:22 crc kubenswrapper[4745]: I0319 00:12:22.866010 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:25 crc kubenswrapper[4745]: E0319 00:12:25.698031 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="6.4s" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.141676 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.142430 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.142844 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.143290 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.144008 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.137086 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.137926 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.138427 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.138702 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.139382 4745 status_manager.go:851] "Failed to get status for pod" 
podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.139719 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.158146 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.158504 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: E0319 00:12:27.159068 4745 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.159637 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: E0319 00:12:27.160041 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-687d544d6d-7tl6q.189e15ac568c56f4 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-687d544d6d-7tl6q,UID:7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d,APIVersion:v1,ResourceVersion:30018,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,LastTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:12:27 crc kubenswrapper[4745]: W0319 00:12:27.196369 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146 WatchSource:0}: Error finding container 906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146: Status 404 returned error can't find the container with id 906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146 Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.808067 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.809074 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.809328 4745 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb" exitCode=1 Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.809398 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb"} Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.810872 4745 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811125 4745 scope.go:117] "RemoveContainer" containerID="0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811265 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc 
kubenswrapper[4745]: I0319 00:12:27.811143 4745 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ee7c6817aea68de19f15cb78d4e294a5507dc112393300176d86df7160f985ea" exitCode=0 Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811170 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ee7c6817aea68de19f15cb78d4e294a5507dc112393300176d86df7160f985ea"} Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811475 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146"} Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811812 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811836 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811919 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: E0319 00:12:27.812274 4745 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: 
connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.812409 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.813033 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.813284 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.813669 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.814127 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.814577 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.815059 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.815614 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.816123 4745 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.981400 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.825514 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"02f97eec88725c668a3b4eae9c7207237aeb66449f48a325a7e6568b50e7da18"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.826067 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4be914544e9953ce510ed2ad681f3ae31cb11387a94fcb7e4e94a4fac099aca7"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.826084 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ea904d0e0bf3b380fa7e494ce26ef4a72587aed13e33ee323cc4a8808bcd4d8f"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.826096 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c439e8c39490851b522803b1283e21d698b19152abfe504e39a3d05254dae0ee"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.834790 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.835646 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.835777 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e170bd623e237b7d4480aa146ecbfaf91651cdb0a850dac4e09af99860e21ab4"} Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.849400 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ab527b0f206fadb762890f83bf5fd29549aefcca2298b8947f66a628f495ec7"} Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.849768 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.850521 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.850595 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.160202 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.160530 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.165149 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.526702 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.865586 4745 patch_prober.go:28] interesting 
pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.865645 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:34 crc kubenswrapper[4745]: I0319 00:12:34.859574 4745 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.527377 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" containerID="cri-o://5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360" gracePeriod=15 Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.880605 4745 generic.go:334] "Generic (PLEG): container finished" podID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerID="5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360" exitCode=0 Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.880688 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerDied","Data":"5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360"} Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 
00:12:35.881277 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.881294 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.884929 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.960034 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037245 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037296 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037326 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037351 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037387 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037454 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037476 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037496 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037522 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037548 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037574 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037596 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.038602 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.038658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.039107 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.039742 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.040267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.044947 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045056 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz" (OuterVolumeSpecName: "kube-api-access-88wjz") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "kube-api-access-88wjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045143 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045495 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045611 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.046034 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.046043 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.046256 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138246 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138278 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138290 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138300 
4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138310 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138319 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138328 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138337 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138346 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138354 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138363 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138372 4745 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138382 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138391 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.171383 4745 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4e655125-59af-476c-8508-2b9550782d73" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888343 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerDied","Data":"6ed318766e28a7953b973ede1884a1baaf2b2983a95f1b27491abc323fc1f4ae"} Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888401 4745 scope.go:117] "RemoveContainer" 
containerID="5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888411 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888561 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888586 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.895198 4745 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4e655125-59af-476c-8508-2b9550782d73" Mar 19 00:12:37 crc kubenswrapper[4745]: I0319 00:12:37.981564 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:37 crc kubenswrapper[4745]: I0319 00:12:37.986010 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:38 crc kubenswrapper[4745]: I0319 00:12:38.902942 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:42 crc kubenswrapper[4745]: I0319 00:12:42.866226 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Mar 19 00:12:42 crc kubenswrapper[4745]: I0319 00:12:42.866623 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.587388 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.937630 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-687d544d6d-7tl6q_7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d/route-controller-manager/0.log" Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.937691 4745 generic.go:334] "Generic (PLEG): container finished" podID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerID="ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177" exitCode=255 Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.937727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerDied","Data":"ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177"} Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.938259 4745 scope.go:117] "RemoveContainer" containerID="ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.085174 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.556083 
4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.618469 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.730975 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.805525 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.944610 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-687d544d6d-7tl6q_7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d/route-controller-manager/0.log" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.944894 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerStarted","Data":"5ab9c0a2bf10341d7966c87c620df68f7517e654098d7f145806f21deb455dbf"} Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.945289 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.969147 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.104905 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 
00:12:46.171903 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.209493 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.243464 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.435010 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.513563 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.828425 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.945731 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.945826 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.228530 4745 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.373758 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.399457 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.441807 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.513495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.534348 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.549323 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.674666 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.735515 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.825037 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.830585 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 
00:12:47.884397 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.920078 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.950280 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.950585 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.983784 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.997996 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.059855 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.103085 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" 
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.127015 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.170260 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.213862 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.269032 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.270590 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.286562 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.391617 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.586012 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.587261 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.601663 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.689488 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.722315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.760088 4745 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.760519 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.760501035 podStartE2EDuration="36.760501035s" podCreationTimestamp="2026-03-19 00:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:34.544287788 +0000 UTC m=+319.082482939" watchObservedRunningTime="2026-03-19 00:12:48.760501035 +0000 UTC m=+333.298696166" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.762396 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podStartSLOduration=41.762385585 podStartE2EDuration="41.762385585s" podCreationTimestamp="2026-03-19 00:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:34.598988985 +0000 UTC m=+319.137184136" watchObservedRunningTime="2026-03-19 00:12:48.762385585 +0000 UTC m=+333.300580716" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.763991 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.764854 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-522nc","openshift-marketplace/community-operators-kgsn7"] Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.764979 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.768619 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.780828 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.780810087999999 podStartE2EDuration="14.780810088s" podCreationTimestamp="2026-03-19 00:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:48.780421336 +0000 UTC m=+333.318616467" watchObservedRunningTime="2026-03-19 00:12:48.780810088 +0000 UTC m=+333.319005219" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.802247 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.808244 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.834587 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.947795 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.025643 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.037091 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.125169 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.227592 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.449818 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.576171 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.622920 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.622956 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.663090 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.804725 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.865376 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.882989 4745 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.908664 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.921058 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.937583 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.001323 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.045632 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.045903 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.125825 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.144620 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" path="/var/lib/kubelet/pods/09d29a41-94df-42b0-b7d3-6b47b06a238f/volumes" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.145325 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" path="/var/lib/kubelet/pods/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4/volumes" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.156405 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.212371 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.227174 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.250125 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.349119 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.364286 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.370828 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.499381 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.567604 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.667001 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.698153 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" 
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.854064 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.908842 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.961453 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.034511 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.063268 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.087602 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.173262 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.192323 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.194153 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.277281 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.327379 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.397128 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.458134 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.486759 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.492406 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.497822 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.555873 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.642791 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.660702 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.707995 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.775737 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.825391 
4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.895112 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.895170 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.949032 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.951208 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.964254 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.993347 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.096600 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.143197 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.149685 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.160361 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" 
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.164019 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.201711 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.310736 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.329740 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.333596 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.494997 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.503383 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.555261 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.640538 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.647574 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 
00:12:52.690315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.745442 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.751875 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.842678 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.842846 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.867793 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.902495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.927297 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.038237 4745 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.177732 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.197742 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 
00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.243418 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.284576 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.329776 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.467556 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.518953 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.596131 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.614683 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.623970 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.656874 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.687474 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.833921 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.936936 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.071406 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.174802 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.188754 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.233221 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.259953 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.521600 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.541327 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.541463 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.707696 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.769793 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.802027 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.893939 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.037763 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.136007 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.163133 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.241579 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.269639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.290466 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.291301 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 00:12:55 crc kubenswrapper[4745]: 
I0319 00:12:55.297634 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.362973 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.391239 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.406950 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.436722 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.459516 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.463915 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.537773 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.537874 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.550819 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.767843 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.980815 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.073681 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.078592 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.099092 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.306098 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.390102 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.418002 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.686644 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.758717 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.768346 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.801393 
4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.836998 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.915512 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.967757 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.973963 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.114625 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.235192 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.288476 4745 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.289014 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" gracePeriod=5 Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.309592 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.332256 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.377155 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.407003 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.437341 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.488962 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.545684 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.553856 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.580283 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.677912 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.716979 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.808335 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.818144 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.826664 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.917535 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.020495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.087858 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.149283 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.150065 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.166319 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.282838 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.304933 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 
19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.335137 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.372590 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.415317 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.468666 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.470120 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593481 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fc96db957-9rmlw"] Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593695 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593706 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593714 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-content" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593722 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-content" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593731 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593737 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593746 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593753 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593767 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-utilities" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593774 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-utilities" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593785 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerName="installer" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593791 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerName="installer" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593870 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593897 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593912 4745 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593926 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerName="installer" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.594276 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.597588 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.597612 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.597696 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.598571 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.599668 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602440 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602569 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602675 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602732 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.603484 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.604570 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.605115 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.605133 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.611672 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.616070 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.618129 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.619917 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fc96db957-9rmlw"] Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.699804 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.699857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.699997 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700022 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-error\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-login\") pod \"oauth-openshift-5fc96db957-9rmlw\" 
(UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700129 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-policies\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-dir\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700210 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700275 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700317 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700350 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700369 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2zt\" (UniqueName: \"kubernetes.io/projected/eda29b61-f024-422b-8558-d7d5c4ef1bfa-kube-api-access-wb2zt\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700407 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-session\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.715631 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801731 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-dir\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801794 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801831 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801854 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801894 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801916 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801933 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2zt\" (UniqueName: \"kubernetes.io/projected/eda29b61-f024-422b-8558-d7d5c4ef1bfa-kube-api-access-wb2zt\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801948 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-session\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " 
pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801973 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801992 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-error\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802062 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-login\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-policies\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802828 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-policies\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802903 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-dir\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.803310 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 
00:12:58.803712 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.804275 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.805782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.810143 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-error\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.810224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.811011 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.815726 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.819698 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.820733 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-session\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.821098 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2zt\" (UniqueName: \"kubernetes.io/projected/eda29b61-f024-422b-8558-d7d5c4ef1bfa-kube-api-access-wb2zt\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: 
\"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.823058 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.827626 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-login\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.841448 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.911698 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.079393 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.195676 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.196371 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.335189 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.356094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fc96db957-9rmlw"] Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.386692 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.444354 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.471400 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.505981 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.506554 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.566553 4745 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.970922 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.023825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" event={"ID":"eda29b61-f024-422b-8558-d7d5c4ef1bfa","Type":"ContainerStarted","Data":"fa8e408cd9744dc60a1a7adbf7b77f161a7b290d421d84041b64bd7e4d2f3d5f"} Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.024191 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.024287 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" event={"ID":"eda29b61-f024-422b-8558-d7d5c4ef1bfa","Type":"ContainerStarted","Data":"c550440c425072db1c1668f1c3846d3b673ba2c62b460a530f56cd7b2897d765"} Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.030585 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.047906 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" podStartSLOduration=50.047866728 podStartE2EDuration="50.047866728s" podCreationTimestamp="2026-03-19 00:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:13:00.043709697 +0000 UTC m=+344.581904858" watchObservedRunningTime="2026-03-19 00:13:00.047866728 +0000 UTC m=+344.586061869" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 
00:13:00.452921 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.603499 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.611352 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 00:13:01 crc kubenswrapper[4745]: I0319 00:13:01.422772 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.876560 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.877006 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958172 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958561 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958957 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958704 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958758 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958856 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959041 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959447 4745 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959528 4745 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959639 4745 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959722 4745 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.964396 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.043988 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.044056 4745 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" exitCode=137 Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.044108 4745 scope.go:117] "RemoveContainer" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.044134 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.060770 4745 scope.go:117] "RemoveContainer" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" Mar 19 00:13:03 crc kubenswrapper[4745]: E0319 00:13:03.061194 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e\": container with ID starting with 01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e not found: ID does not exist" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.061234 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e"} err="failed to get container status \"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e\": rpc error: code = NotFound desc = could not find container 
\"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e\": container with ID starting with 01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e not found: ID does not exist" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.061343 4745 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.144662 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.145180 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.154125 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.154171 4745 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c30049d6-78af-46f5-b93d-7a74c53cbc3a" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.157766 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.157813 4745 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c30049d6-78af-46f5-b93d-7a74c53cbc3a" Mar 19 00:13:07 crc kubenswrapper[4745]: I0319 00:13:07.645920 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 00:13:18 crc kubenswrapper[4745]: I0319 
00:13:18.130726 4745 generic.go:334] "Generic (PLEG): container finished" podID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" exitCode=0 Mar 19 00:13:18 crc kubenswrapper[4745]: I0319 00:13:18.130786 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerDied","Data":"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"} Mar 19 00:13:18 crc kubenswrapper[4745]: I0319 00:13:18.131635 4745 scope.go:117] "RemoveContainer" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" Mar 19 00:13:19 crc kubenswrapper[4745]: I0319 00:13:19.139935 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerStarted","Data":"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"} Mar 19 00:13:19 crc kubenswrapper[4745]: I0319 00:13:19.141208 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:13:19 crc kubenswrapper[4745]: I0319 00:13:19.143468 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.300011 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.300820 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wf9ss" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" containerID="cri-o://71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" gracePeriod=2 Mar 
19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.806858 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.975712 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.975770 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.975822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.978700 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities" (OuterVolumeSpecName: "utilities") pod "04cc89b5-7bac-4b91-bb97-a1f5ab14260c" (UID: "04cc89b5-7bac-4b91-bb97-a1f5ab14260c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.982938 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28" (OuterVolumeSpecName: "kube-api-access-8sq28") pod "04cc89b5-7bac-4b91-bb97-a1f5ab14260c" (UID: "04cc89b5-7bac-4b91-bb97-a1f5ab14260c"). InnerVolumeSpecName "kube-api-access-8sq28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.004759 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04cc89b5-7bac-4b91-bb97-a1f5ab14260c" (UID: "04cc89b5-7bac-4b91-bb97-a1f5ab14260c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.077127 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.077159 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.077169 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265266 4745 generic.go:334] "Generic (PLEG): container finished" podID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" 
containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" exitCode=0 Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265358 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265368 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba"} Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac"} Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265767 4745 scope.go:117] "RemoveContainer" containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.291765 4745 scope.go:117] "RemoveContainer" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.304562 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.311794 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.314968 4745 scope.go:117] "RemoveContainer" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.343392 4745 scope.go:117] "RemoveContainer" containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" Mar 19 
00:13:43 crc kubenswrapper[4745]: E0319 00:13:43.343812 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba\": container with ID starting with 71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba not found: ID does not exist" containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.343858 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba"} err="failed to get container status \"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba\": rpc error: code = NotFound desc = could not find container \"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba\": container with ID starting with 71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba not found: ID does not exist" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.343905 4745 scope.go:117] "RemoveContainer" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" Mar 19 00:13:43 crc kubenswrapper[4745]: E0319 00:13:43.344307 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5\": container with ID starting with 982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5 not found: ID does not exist" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.344334 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5"} err="failed to get container status 
\"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5\": rpc error: code = NotFound desc = could not find container \"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5\": container with ID starting with 982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5 not found: ID does not exist" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.344353 4745 scope.go:117] "RemoveContainer" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" Mar 19 00:13:43 crc kubenswrapper[4745]: E0319 00:13:43.344530 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa\": container with ID starting with 6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa not found: ID does not exist" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.344551 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa"} err="failed to get container status \"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa\": rpc error: code = NotFound desc = could not find container \"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa\": container with ID starting with 6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa not found: ID does not exist" Mar 19 00:13:44 crc kubenswrapper[4745]: I0319 00:13:44.144482 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" path="/var/lib/kubelet/pods/04cc89b5-7bac-4b91-bb97-a1f5ab14260c/volumes" Mar 19 00:13:55 crc kubenswrapper[4745]: I0319 00:13:55.499378 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:13:55 
crc kubenswrapper[4745]: I0319 00:13:55.500225 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" containerID="cri-o://5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" gracePeriod=2 Mar 19 00:13:55 crc kubenswrapper[4745]: I0319 00:13:55.870375 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.030938 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"b19d4fad-672f-40f3-bfdb-53b36da06399\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.031432 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"b19d4fad-672f-40f3-bfdb-53b36da06399\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.031469 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"b19d4fad-672f-40f3-bfdb-53b36da06399\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.031972 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities" (OuterVolumeSpecName: "utilities") pod "b19d4fad-672f-40f3-bfdb-53b36da06399" (UID: "b19d4fad-672f-40f3-bfdb-53b36da06399"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.038393 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9" (OuterVolumeSpecName: "kube-api-access-mdzp9") pod "b19d4fad-672f-40f3-bfdb-53b36da06399" (UID: "b19d4fad-672f-40f3-bfdb-53b36da06399"). InnerVolumeSpecName "kube-api-access-mdzp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.133426 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.133466 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.167375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b19d4fad-672f-40f3-bfdb-53b36da06399" (UID: "b19d4fad-672f-40f3-bfdb-53b36da06399"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.234981 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330393 4745 generic.go:334] "Generic (PLEG): container finished" podID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" exitCode=0 Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330441 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c"} Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330455 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330477 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880"} Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330499 4745 scope.go:117] "RemoveContainer" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.345802 4745 scope.go:117] "RemoveContainer" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.363022 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.367782 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.389608 4745 scope.go:117] "RemoveContainer" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.406223 4745 scope.go:117] "RemoveContainer" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" Mar 19 00:13:56 crc kubenswrapper[4745]: E0319 00:13:56.407095 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c\": container with ID starting with 5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c not found: ID does not exist" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407141 4745 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c"} err="failed to get container status \"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c\": rpc error: code = NotFound desc = could not find container \"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c\": container with ID starting with 5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c not found: ID does not exist" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407169 4745 scope.go:117] "RemoveContainer" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" Mar 19 00:13:56 crc kubenswrapper[4745]: E0319 00:13:56.407460 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370\": container with ID starting with f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370 not found: ID does not exist" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407500 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370"} err="failed to get container status \"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370\": rpc error: code = NotFound desc = could not find container \"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370\": container with ID starting with f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370 not found: ID does not exist" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407527 4745 scope.go:117] "RemoveContainer" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" Mar 19 00:13:56 crc kubenswrapper[4745]: E0319 
00:13:56.407796 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6\": container with ID starting with eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6 not found: ID does not exist" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407820 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6"} err="failed to get container status \"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6\": rpc error: code = NotFound desc = could not find container \"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6\": container with ID starting with eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6 not found: ID does not exist" Mar 19 00:13:58 crc kubenswrapper[4745]: I0319 00:13:58.144537 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" path="/var/lib/kubelet/pods/b19d4fad-672f-40f3-bfdb-53b36da06399/volumes" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.164516 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165097 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165110 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165123 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165129 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165142 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165153 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165159 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165165 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165174 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165179 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165192 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165199 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165291 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165303 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165642 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.168471 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.168980 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.169343 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.170738 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.287202 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"auto-csr-approver-29564654-j2b7b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.388669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"auto-csr-approver-29564654-j2b7b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " 
pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.410668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"auto-csr-approver-29564654-j2b7b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.481827 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.898430 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:14:01 crc kubenswrapper[4745]: I0319 00:14:01.355871 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" event={"ID":"f7689b2b-3fcb-4122-bb50-fb8215cdb08b","Type":"ContainerStarted","Data":"4a1aa249e4c789f82a17ac8341e4377f3f91069620b3cc7aa8bf4600e94dcc7f"} Mar 19 00:14:03 crc kubenswrapper[4745]: I0319 00:14:03.373547 4745 generic.go:334] "Generic (PLEG): container finished" podID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerID="d0b7bf2fb29c7b89c86195effcef47a72d6e88d2457a53a6804ca521616f6ee6" exitCode=0 Mar 19 00:14:03 crc kubenswrapper[4745]: I0319 00:14:03.373685 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" event={"ID":"f7689b2b-3fcb-4122-bb50-fb8215cdb08b","Type":"ContainerDied","Data":"d0b7bf2fb29c7b89c86195effcef47a72d6e88d2457a53a6804ca521616f6ee6"} Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.635049 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.734812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.741748 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7" (OuterVolumeSpecName: "kube-api-access-t8tm7") pod "f7689b2b-3fcb-4122-bb50-fb8215cdb08b" (UID: "f7689b2b-3fcb-4122-bb50-fb8215cdb08b"). InnerVolumeSpecName "kube-api-access-t8tm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.836203 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:05 crc kubenswrapper[4745]: I0319 00:14:05.384670 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" event={"ID":"f7689b2b-3fcb-4122-bb50-fb8215cdb08b","Type":"ContainerDied","Data":"4a1aa249e4c789f82a17ac8341e4377f3f91069620b3cc7aa8bf4600e94dcc7f"} Mar 19 00:14:05 crc kubenswrapper[4745]: I0319 00:14:05.384716 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1aa249e4c789f82a17ac8341e4377f3f91069620b3cc7aa8bf4600e94dcc7f" Mar 19 00:14:05 crc kubenswrapper[4745]: I0319 00:14:05.384730 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.660721 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrzf6"] Mar 19 00:14:11 crc kubenswrapper[4745]: E0319 00:14:11.662838 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerName="oc" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.663053 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerName="oc" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.663223 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerName="oc" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.663941 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.678482 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrzf6"] Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.819784 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-tls\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/855c57ce-9fe1-4725-87e6-4b777c1fd55f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: 
\"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820501 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-bound-sa-token\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820664 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-trusted-ca\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820774 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/855c57ce-9fe1-4725-87e6-4b777c1fd55f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820922 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.821080 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-certificates\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.821195 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjzj\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-kube-api-access-gtjzj\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.844160 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-trusted-ca\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922710 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/855c57ce-9fe1-4725-87e6-4b777c1fd55f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: 
\"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-certificates\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjzj\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-kube-api-access-gtjzj\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922804 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/855c57ce-9fe1-4725-87e6-4b777c1fd55f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922823 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-tls\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922854 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-bound-sa-token\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.923436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/855c57ce-9fe1-4725-87e6-4b777c1fd55f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.924101 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-certificates\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.924187 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-trusted-ca\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.928859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/855c57ce-9fe1-4725-87e6-4b777c1fd55f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.929405 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-tls\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.941774 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-bound-sa-token\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.941852 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjzj\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-kube-api-access-gtjzj\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.983457 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.181850 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrzf6"] Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.424690 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" event={"ID":"855c57ce-9fe1-4725-87e6-4b777c1fd55f","Type":"ContainerStarted","Data":"1738f7f83a17a1cb84f1a84e0a5256950186d622e0c9ce3ed4afb6f9f740b175"} Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.425059 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.425071 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" event={"ID":"855c57ce-9fe1-4725-87e6-4b777c1fd55f","Type":"ContainerStarted","Data":"7ade38938661cc1417844e00b94825e429fabf80b6db6f36ee9c2ef82efca515"} Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.445050 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" podStartSLOduration=1.445029202 podStartE2EDuration="1.445029202s" podCreationTimestamp="2026-03-19 00:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:14:12.442652207 +0000 UTC m=+416.980847358" watchObservedRunningTime="2026-03-19 00:14:12.445029202 +0000 UTC m=+416.983224333" Mar 19 00:14:15 crc kubenswrapper[4745]: I0319 00:14:15.606450 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:14:15 crc kubenswrapper[4745]: I0319 00:14:15.606742 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.622512 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.623506 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hhfzg" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server" containerID="cri-o://0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.632182 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.632538 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q9zn6" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server" containerID="cri-o://1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.637541 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.637977 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" 
podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" containerID="cri-o://1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.651106 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hcgn8"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.652398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.654792 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.661386 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.661717 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-75vmv" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server" containerID="cri-o://e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.662275 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mtjq5" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server" containerID="cri-o://97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.671914 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hcgn8"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.789326 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.789390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkcr\" (UniqueName: \"kubernetes.io/projected/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-kube-api-access-9lkcr\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.789414 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.890861 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.890928 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lkcr\" (UniqueName: \"kubernetes.io/projected/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-kube-api-access-9lkcr\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.890945 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.892584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.898270 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.922741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkcr\" (UniqueName: \"kubernetes.io/projected/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-kube-api-access-9lkcr\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.979481 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.127527 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.131219 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.133603 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.144917 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.145664 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.294921 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.294978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295042 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"c21b8175-025a-4d91-ad43-389dbad40846\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295060 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod \"c21b8175-025a-4d91-ad43-389dbad40846\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295090 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295107 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295127 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"2c3c406d-9994-4629-b585-4d145b1e04aa\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295141 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295158 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295194 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"2c3c406d-9994-4629-b585-4d145b1e04aa\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295231 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\" (UID: 
\"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295268 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"c21b8175-025a-4d91-ad43-389dbad40846\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295287 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"2c3c406d-9994-4629-b585-4d145b1e04aa\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295303 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.296739 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities" (OuterVolumeSpecName: "utilities") pod "0c1d22d3-b584-4622-856c-b531a5d1ad5d" (UID: "0c1d22d3-b584-4622-856c-b531a5d1ad5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.297684 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities" (OuterVolumeSpecName: "utilities") pod "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" (UID: "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.298220 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e2cfb22a-6632-4e35-8145-6e9815e6e76f" (UID: "e2cfb22a-6632-4e35-8145-6e9815e6e76f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.298723 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities" (OuterVolumeSpecName: "utilities") pod "2c3c406d-9994-4629-b585-4d145b1e04aa" (UID: "2c3c406d-9994-4629-b585-4d145b1e04aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.299996 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities" (OuterVolumeSpecName: "utilities") pod "c21b8175-025a-4d91-ad43-389dbad40846" (UID: "c21b8175-025a-4d91-ad43-389dbad40846"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.301601 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj" (OuterVolumeSpecName: "kube-api-access-rttcj") pod "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" (UID: "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0"). InnerVolumeSpecName "kube-api-access-rttcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.302337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk" (OuterVolumeSpecName: "kube-api-access-rdqmk") pod "c21b8175-025a-4d91-ad43-389dbad40846" (UID: "c21b8175-025a-4d91-ad43-389dbad40846"). InnerVolumeSpecName "kube-api-access-rdqmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.303903 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp" (OuterVolumeSpecName: "kube-api-access-lhwtp") pod "e2cfb22a-6632-4e35-8145-6e9815e6e76f" (UID: "e2cfb22a-6632-4e35-8145-6e9815e6e76f"). InnerVolumeSpecName "kube-api-access-lhwtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.304536 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f" (OuterVolumeSpecName: "kube-api-access-sqj2f") pod "2c3c406d-9994-4629-b585-4d145b1e04aa" (UID: "2c3c406d-9994-4629-b585-4d145b1e04aa"). InnerVolumeSpecName "kube-api-access-sqj2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.312537 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh" (OuterVolumeSpecName: "kube-api-access-82wwh") pod "0c1d22d3-b584-4622-856c-b531a5d1ad5d" (UID: "0c1d22d3-b584-4622-856c-b531a5d1ad5d"). InnerVolumeSpecName "kube-api-access-82wwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.315562 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e2cfb22a-6632-4e35-8145-6e9815e6e76f" (UID: "e2cfb22a-6632-4e35-8145-6e9815e6e76f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.326569 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c3c406d-9994-4629-b585-4d145b1e04aa" (UID: "2c3c406d-9994-4629-b585-4d145b1e04aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.363660 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c1d22d3-b584-4622-856c-b531a5d1ad5d" (UID: "0c1d22d3-b584-4622-856c-b531a5d1ad5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.367543 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c21b8175-025a-4d91-ad43-389dbad40846" (UID: "c21b8175-025a-4d91-ad43-389dbad40846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396540 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396597 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396613 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396625 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396637 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396647 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rttcj\" (UniqueName: 
\"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396658 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396669 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396681 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396691 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396701 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396711 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396723 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396733 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.442653 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" (UID: "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.468237 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hcgn8"] Mar 19 00:14:24 crc kubenswrapper[4745]: W0319 00:14:24.472004 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc2e0fe_a8bd_4a4f_9ee2_1685bc395d06.slice/crio-eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7 WatchSource:0}: Error finding container eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7: Status 404 returned error can't find the container with id eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495198 4745 generic.go:334] "Generic (PLEG): container finished" podID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495248 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495309 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"1385db0e9218cd6a53bd844f3c99f4797ac5eed30c3c8451117c54ef69a818d9"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495327 4745 scope.go:117] "RemoveContainer" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495266 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.497044 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" event={"ID":"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06","Type":"ContainerStarted","Data":"eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.497564 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500097 4745 generic.go:334] "Generic (PLEG): container finished" podID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500134 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500152 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerDied","Data":"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500412 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerDied","Data":"1ba6e1f38cebd60b48f169bf9b16cb68b35cbd4232e7b7482a4e5339486334e0"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506078 4745 generic.go:334] "Generic (PLEG): container finished" podID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506269 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506312 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.508656 4745 generic.go:334] "Generic (PLEG): container finished" podID="c21b8175-025a-4d91-ad43-389dbad40846" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.508934 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.508964 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.509021 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513440 4745 scope.go:117] "RemoveContainer" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513495 4745 generic.go:334] "Generic (PLEG): container finished" podID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513527 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513553 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"2cd3908399145e2519a565664cfaff071ac8bf459660c66c4bd6a1d4b7d2532a"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513611 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.549907 4745 scope.go:117] "RemoveContainer" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.585822 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.585962 4745 scope.go:117] "RemoveContainer" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.586385 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9\": container with ID starting with 0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9 not found: ID does not exist" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586433 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9"} err="failed to get container status \"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9\": rpc error: code = NotFound desc = could not find container \"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9\": container with ID starting with 0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586466 4745 scope.go:117] "RemoveContainer" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.586900 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3\": container with ID starting with abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3 not found: ID does not exist" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586926 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3"} err="failed to get container status \"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3\": rpc error: code = NotFound desc = could not find container \"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3\": container with ID starting with abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586941 4745 scope.go:117] "RemoveContainer" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.587470 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db\": container with ID starting with aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db not found: ID does not exist" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.587625 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db"} err="failed to get container status \"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db\": rpc error: code = NotFound desc = could not find container 
\"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db\": container with ID starting with aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.587709 4745 scope.go:117] "RemoveContainer" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.602806 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.608166 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.621217 4745 scope.go:117] "RemoveContainer" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.633346 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.652942 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.664295 4745 scope.go:117] "RemoveContainer" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.678536 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243\": container with ID starting with 1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243 not found: ID does not exist" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.678621 4745 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"} err="failed to get container status \"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243\": rpc error: code = NotFound desc = could not find container \"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243\": container with ID starting with 1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.678655 4745 scope.go:117] "RemoveContainer" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.679180 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8\": container with ID starting with afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8 not found: ID does not exist" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.679209 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"} err="failed to get container status \"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8\": rpc error: code = NotFound desc = could not find container \"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8\": container with ID starting with afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.679239 4745 scope.go:117] "RemoveContainer" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.705976 4745 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.711980 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.716614 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.722157 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.725503 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.726980 4745 scope.go:117] "RemoveContainer" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.742576 4745 scope.go:117] "RemoveContainer" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.760497 4745 scope.go:117] "RemoveContainer" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.761179 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42\": container with ID starting with 97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42 not found: ID does not exist" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761216 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"} err="failed to get container status \"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42\": rpc error: code = NotFound desc = could not find container \"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42\": container with ID starting with 97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761241 4745 scope.go:117] "RemoveContainer" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.761580 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4\": container with ID starting with 6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4 not found: ID does not exist" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761608 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"} err="failed to get container status \"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4\": rpc error: code = NotFound desc = could not find container \"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4\": container with ID starting with 6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761646 4745 scope.go:117] "RemoveContainer" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.761917 4745 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1\": container with ID starting with 595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1 not found: ID does not exist" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761941 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"} err="failed to get container status \"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1\": rpc error: code = NotFound desc = could not find container \"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1\": container with ID starting with 595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761953 4745 scope.go:117] "RemoveContainer" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.808764 4745 scope.go:117] "RemoveContainer" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.826680 4745 scope.go:117] "RemoveContainer" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.846710 4745 scope.go:117] "RemoveContainer" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.849252 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f\": container with ID starting with 
1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f not found: ID does not exist" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.849294 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"} err="failed to get container status \"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f\": rpc error: code = NotFound desc = could not find container \"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f\": container with ID starting with 1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.849321 4745 scope.go:117] "RemoveContainer" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.850281 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9\": container with ID starting with 3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9 not found: ID does not exist" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850352 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"} err="failed to get container status \"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9\": rpc error: code = NotFound desc = could not find container \"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9\": container with ID starting with 3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9 not found: ID does not 
exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850400 4745 scope.go:117] "RemoveContainer" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.850825 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0\": container with ID starting with 80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0 not found: ID does not exist" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850864 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"} err="failed to get container status \"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0\": rpc error: code = NotFound desc = could not find container \"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0\": container with ID starting with 80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850899 4745 scope.go:117] "RemoveContainer" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.865300 4745 scope.go:117] "RemoveContainer" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.885068 4745 scope.go:117] "RemoveContainer" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.906898 4745 scope.go:117] "RemoveContainer" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" Mar 19 00:14:24 crc 
kubenswrapper[4745]: E0319 00:14:24.907509 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a\": container with ID starting with e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a not found: ID does not exist" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.907568 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"} err="failed to get container status \"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a\": rpc error: code = NotFound desc = could not find container \"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a\": container with ID starting with e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.907609 4745 scope.go:117] "RemoveContainer" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.908302 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9\": container with ID starting with 568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9 not found: ID does not exist" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.908473 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"} err="failed to get container status 
\"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9\": rpc error: code = NotFound desc = could not find container \"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9\": container with ID starting with 568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.908644 4745 scope.go:117] "RemoveContainer" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.909195 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4\": container with ID starting with 478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4 not found: ID does not exist" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.909233 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"} err="failed to get container status \"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4\": rpc error: code = NotFound desc = could not find container \"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4\": container with ID starting with 478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4 not found: ID does not exist" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.527490 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" event={"ID":"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06","Type":"ContainerStarted","Data":"55f548fd7298387a80f438dd5f717d328157496f518f27067bafdbde5b40e715"} Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.527947 4745 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.532478 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.546522 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" podStartSLOduration=2.546496338 podStartE2EDuration="2.546496338s" podCreationTimestamp="2026-03-19 00:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:14:25.546257892 +0000 UTC m=+430.084453023" watchObservedRunningTime="2026-03-19 00:14:25.546496338 +0000 UTC m=+430.084691469" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832030 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832218 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832230 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832239 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832245 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832254 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832263 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832270 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832276 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832285 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832291 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832300 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832306 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832313 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833298 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833314 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833320 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833328 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833333 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833344 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833349 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-utilities" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833357 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833362 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833370 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833375 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-content" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833386 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833391 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833495 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833511 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833519 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833525 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833532 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833540 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server" Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833625 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833634 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.834244 4745 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.837530 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.846081 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.924740 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.924812 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.924948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.026494 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"redhat-marketplace-j587v\" 
(UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.026565 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.026620 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.027546 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.027834 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.028947 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.030223 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.033360 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.044369 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.055792 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.127724 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.127840 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.127864 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"certified-operators-vj7rp\" (UID: 
\"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.144622 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" path="/var/lib/kubelet/pods/0c1d22d3-b584-4622-856c-b531a5d1ad5d/volumes" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.145393 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" path="/var/lib/kubelet/pods/2c3c406d-9994-4629-b585-4d145b1e04aa/volumes" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.146049 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" path="/var/lib/kubelet/pods/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0/volumes" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.147301 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21b8175-025a-4d91-ad43-389dbad40846" path="/var/lib/kubelet/pods/c21b8175-025a-4d91-ad43-389dbad40846/volumes" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.148017 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" path="/var/lib/kubelet/pods/e2cfb22a-6632-4e35-8145-6e9815e6e76f/volumes" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.156773 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.228843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230307 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230370 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230632 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " 
pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.249817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.345777 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.355644 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.535325 4745 generic.go:334] "Generic (PLEG): container finished" podID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" exitCode=0 Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.535384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae"} Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.536522 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerStarted","Data":"5e90ccfe8369ff143297fb00f49862c706dc2eb6a69c3dc1f5670ef331a15a02"} Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.078808 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.541903 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerStarted","Data":"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c"} Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.544393 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa950165-f194-4022-8333-581d7681fc74" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" exitCode=0 Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.544497 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc"} Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.544527 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerStarted","Data":"c186af9e1f5669e5491c37e499f3a6a8a28b64cddfa66b87effddfaec8dbd826"} Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.238674 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pq2gm"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.239811 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.246298 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.249271 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pq2gm"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.371925 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-catalog-content\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.371987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-627lx\" (UniqueName: \"kubernetes.io/projected/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-kube-api-access-627lx\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.372040 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-utilities\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.431128 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ddfl"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.432251 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.435774 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.449505 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ddfl"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.472808 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-627lx\" (UniqueName: \"kubernetes.io/projected/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-kube-api-access-627lx\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.472920 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-utilities\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.472960 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-catalog-content\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.473696 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-catalog-content\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " 
pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.473782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-utilities\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.494351 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-627lx\" (UniqueName: \"kubernetes.io/projected/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-kube-api-access-627lx\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.551693 4745 generic.go:334] "Generic (PLEG): container finished" podID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" exitCode=0 Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.551784 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c"} Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.554721 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa950165-f194-4022-8333-581d7681fc74" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" exitCode=0 Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.554774 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0"} Mar 19 
00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.564299 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.574108 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-catalog-content\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.574174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-utilities\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.574230 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlc2\" (UniqueName: \"kubernetes.io/projected/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-kube-api-access-fjlc2\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.675242 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-catalog-content\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.675305 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-utilities\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.675368 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlc2\" (UniqueName: \"kubernetes.io/projected/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-kube-api-access-fjlc2\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.676198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-catalog-content\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.677031 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-utilities\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.696432 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlc2\" (UniqueName: \"kubernetes.io/projected/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-kube-api-access-fjlc2\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.758016 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.954658 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pq2gm"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.966913 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ddfl"] Mar 19 00:14:28 crc kubenswrapper[4745]: W0319 00:14:28.976962 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff66d22_2b4c_4f11_acfe_06173ee9a07e.slice/crio-f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0 WatchSource:0}: Error finding container f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0: Status 404 returned error can't find the container with id f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0 Mar 19 00:14:28 crc kubenswrapper[4745]: W0319 00:14:28.981130 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ffb13a_ff99_429d_bfdc_cdd2a43b90c5.slice/crio-90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878 WatchSource:0}: Error finding container 90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878: Status 404 returned error can't find the container with id 90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878 Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.562307 4745 generic.go:334] "Generic (PLEG): container finished" podID="e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5" containerID="84d24d50c83144d9d3b72ba47fd55ce8d6a46eea80ce2a350b7057cb367e05e7" exitCode=0 Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.563641 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" 
event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerDied","Data":"84d24d50c83144d9d3b72ba47fd55ce8d6a46eea80ce2a350b7057cb367e05e7"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.564196 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerStarted","Data":"90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.570517 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerStarted","Data":"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.572728 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerStarted","Data":"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.574585 4745 generic.go:334] "Generic (PLEG): container finished" podID="0ff66d22-2b4c-4f11-acfe-06173ee9a07e" containerID="e1a1ce8f7fbe5a5d7423efce319a2083c3af785662842d0075f56e3632d01311" exitCode=0 Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.574624 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerDied","Data":"e1a1ce8f7fbe5a5d7423efce319a2083c3af785662842d0075f56e3632d01311"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.574654 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" 
event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerStarted","Data":"f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.632732 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vj7rp" podStartSLOduration=2.220038687 podStartE2EDuration="3.632710551s" podCreationTimestamp="2026-03-19 00:14:26 +0000 UTC" firstStartedPulling="2026-03-19 00:14:27.546474736 +0000 UTC m=+432.084669867" lastFinishedPulling="2026-03-19 00:14:28.95914661 +0000 UTC m=+433.497341731" observedRunningTime="2026-03-19 00:14:29.610156047 +0000 UTC m=+434.148351178" watchObservedRunningTime="2026-03-19 00:14:29.632710551 +0000 UTC m=+434.170905682" Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.649544 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j587v" podStartSLOduration=2.138866935 podStartE2EDuration="4.649522982s" podCreationTimestamp="2026-03-19 00:14:25 +0000 UTC" firstStartedPulling="2026-03-19 00:14:26.540233895 +0000 UTC m=+431.078429026" lastFinishedPulling="2026-03-19 00:14:29.050889942 +0000 UTC m=+433.589085073" observedRunningTime="2026-03-19 00:14:29.645319099 +0000 UTC m=+434.183514220" watchObservedRunningTime="2026-03-19 00:14:29.649522982 +0000 UTC m=+434.187718123" Mar 19 00:14:30 crc kubenswrapper[4745]: I0319 00:14:30.591679 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerStarted","Data":"0c7790717956205103ceba96837b625aab857083930a1640d4c0abb4cb000f52"} Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.616699 4745 generic.go:334] "Generic (PLEG): container finished" podID="e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5" containerID="dcaff950c5d146e7e21523621dbc11cbe640f22926270d35b3c7e51b5bca76c2" exitCode=0 Mar 19 
00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.617388 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerDied","Data":"dcaff950c5d146e7e21523621dbc11cbe640f22926270d35b3c7e51b5bca76c2"} Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.630738 4745 generic.go:334] "Generic (PLEG): container finished" podID="0ff66d22-2b4c-4f11-acfe-06173ee9a07e" containerID="0c7790717956205103ceba96837b625aab857083930a1640d4c0abb4cb000f52" exitCode=0 Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.630776 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerDied","Data":"0c7790717956205103ceba96837b625aab857083930a1640d4c0abb4cb000f52"} Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.988101 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.048706 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.639428 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerStarted","Data":"d0adfba1067338989c5d93f619f49c1e3c7884671c15c26bb96ac736763d46a5"} Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.646294 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerStarted","Data":"184bd62fbc5252a7e817993caec127eccacb7441498ad10ba91863fede3ce124"} Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.660990 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pq2gm" podStartSLOduration=2.186050609 podStartE2EDuration="4.660971275s" podCreationTimestamp="2026-03-19 00:14:28 +0000 UTC" firstStartedPulling="2026-03-19 00:14:29.57578992 +0000 UTC m=+434.113985061" lastFinishedPulling="2026-03-19 00:14:32.050710596 +0000 UTC m=+436.588905727" observedRunningTime="2026-03-19 00:14:32.657282188 +0000 UTC m=+437.195477319" watchObservedRunningTime="2026-03-19 00:14:32.660971275 +0000 UTC m=+437.199166406" Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.677291 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ddfl" podStartSLOduration=2.148559322 podStartE2EDuration="4.67727351s" podCreationTimestamp="2026-03-19 00:14:28 +0000 UTC" firstStartedPulling="2026-03-19 00:14:29.565756793 +0000 UTC m=+434.103951924" lastFinishedPulling="2026-03-19 00:14:32.094470971 +0000 UTC m=+436.632666112" observedRunningTime="2026-03-19 00:14:32.673991797 +0000 UTC m=+437.212186938" watchObservedRunningTime="2026-03-19 00:14:32.67727351 +0000 UTC m=+437.215468641" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.157844 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.158249 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.202865 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.356792 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 
00:14:36.356864 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.400963 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.707934 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.710284 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.565354 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.565697 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.607896 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.716324 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.758462 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.758678 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.795842 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:39 crc kubenswrapper[4745]: I0319 00:14:39.716182 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:45 crc kubenswrapper[4745]: I0319 00:14:45.606205 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:14:45 crc kubenswrapper[4745]: I0319 00:14:45.606569 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.093487 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry" containerID="cri-o://81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" gracePeriod=30 Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.435747 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.530809 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.530956 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531001 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531355 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531433 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531782 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.532627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.532761 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533124 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533167 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533665 4745 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533691 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.539515 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.541009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.541909 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc" (OuterVolumeSpecName: "kube-api-access-z6wxc") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "kube-api-access-z6wxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.542928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.544783 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.551934 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635359 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635409 4745 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635440 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635453 4745 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635467 4745 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793617 4745 generic.go:334] "Generic (PLEG): container 
finished" podID="b246ac53-9c42-426b-97da-3ca4075766ab" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" exitCode=0 Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793661 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerDied","Data":"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"} Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793696 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerDied","Data":"de743d9871d9e519d057ac18763fdb10aeceb0154e79a293ccaea85445d780d3"} Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793719 4745 scope.go:117] "RemoveContainer" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793748 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.810965 4745 scope.go:117] "RemoveContainer" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" Mar 19 00:14:57 crc kubenswrapper[4745]: E0319 00:14:57.811511 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0\": container with ID starting with 81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0 not found: ID does not exist" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.811559 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"} err="failed to get container status \"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0\": rpc error: code = NotFound desc = could not find container \"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0\": container with ID starting with 81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0 not found: ID does not exist" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.828841 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.832742 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] Mar 19 00:14:58 crc kubenswrapper[4745]: I0319 00:14:58.145682 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" path="/var/lib/kubelet/pods/b246ac53-9c42-426b-97da-3ca4075766ab/volumes" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 
00:15:00.145970 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"] Mar 19 00:15:00 crc kubenswrapper[4745]: E0319 00:15:00.146154 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.146167 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.146269 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.146654 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.149938 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.150124 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.152504 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"] Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.267377 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc 
kubenswrapper[4745]: I0319 00:15:00.267456 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.267495 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.368164 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.368260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.368294 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod 
\"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.369430 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.375325 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.386799 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.467015 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.714927 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"] Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.812111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" event={"ID":"9310ac17-728f-450e-9e05-4159ab257626","Type":"ContainerStarted","Data":"051f03921ea5e42b6a343d7a56bd3fd8fbad5e12835e07f347b447cf3fcb073c"} Mar 19 00:15:01 crc kubenswrapper[4745]: I0319 00:15:01.818428 4745 generic.go:334] "Generic (PLEG): container finished" podID="9310ac17-728f-450e-9e05-4159ab257626" containerID="3418c0630502ef3e4a53904e75f1c29d16c5a3f13f1d69499262a568825926a3" exitCode=0 Mar 19 00:15:01 crc kubenswrapper[4745]: I0319 00:15:01.818480 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" event={"ID":"9310ac17-728f-450e-9e05-4159ab257626","Type":"ContainerDied","Data":"3418c0630502ef3e4a53904e75f1c29d16c5a3f13f1d69499262a568825926a3"} Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.030116 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.099173 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod \"9310ac17-728f-450e-9e05-4159ab257626\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.099264 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"9310ac17-728f-450e-9e05-4159ab257626\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.099323 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"9310ac17-728f-450e-9e05-4159ab257626\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.100416 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume" (OuterVolumeSpecName: "config-volume") pod "9310ac17-728f-450e-9e05-4159ab257626" (UID: "9310ac17-728f-450e-9e05-4159ab257626"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.104971 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9310ac17-728f-450e-9e05-4159ab257626" (UID: "9310ac17-728f-450e-9e05-4159ab257626"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.105771 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl" (OuterVolumeSpecName: "kube-api-access-kfbsl") pod "9310ac17-728f-450e-9e05-4159ab257626" (UID: "9310ac17-728f-450e-9e05-4159ab257626"). InnerVolumeSpecName "kube-api-access-kfbsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.200677 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.200735 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") on node \"crc\" DevicePath \"\"" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.200750 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.830373 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" event={"ID":"9310ac17-728f-450e-9e05-4159ab257626","Type":"ContainerDied","Data":"051f03921ea5e42b6a343d7a56bd3fd8fbad5e12835e07f347b447cf3fcb073c"} Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.830407 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051f03921ea5e42b6a343d7a56bd3fd8fbad5e12835e07f347b447cf3fcb073c" Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.830442 4745 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.606018 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.607099 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.607209 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.608757 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.609029 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10" gracePeriod=600 Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911446 4745 generic.go:334] "Generic (PLEG): container 
finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10" exitCode=0 Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911510 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10"} Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d"} Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911864 4745 scope.go:117] "RemoveContainer" containerID="e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.147722 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"] Mar 19 00:16:00 crc kubenswrapper[4745]: E0319 00:16:00.149411 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9310ac17-728f-450e-9e05-4159ab257626" containerName="collect-profiles" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.149483 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9310ac17-728f-450e-9e05-4159ab257626" containerName="collect-profiles" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.149655 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9310ac17-728f-450e-9e05-4159ab257626" containerName="collect-profiles" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.150086 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"] Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 
00:16:00.150306 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.153916 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.154098 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.154549 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.336892 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"auto-csr-approver-29564656-7jksw\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.441950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"auto-csr-approver-29564656-7jksw\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.463183 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"auto-csr-approver-29564656-7jksw\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 
00:16:00.471092 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.666226 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"] Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.692274 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:16:01 crc kubenswrapper[4745]: I0319 00:16:01.176360 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564656-7jksw" event={"ID":"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1","Type":"ContainerStarted","Data":"1a176777ee24fdbcfa0fdb9bbe56ac8ed1ef043829a0399e6cd32adf043133da"} Mar 19 00:16:02 crc kubenswrapper[4745]: I0319 00:16:02.182253 4745 generic.go:334] "Generic (PLEG): container finished" podID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerID="4cf1138f66461b0db8f8d82a562c5595a0d75aaab97369e7761de279cdf0fb9b" exitCode=0 Mar 19 00:16:02 crc kubenswrapper[4745]: I0319 00:16:02.182297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564656-7jksw" event={"ID":"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1","Type":"ContainerDied","Data":"4cf1138f66461b0db8f8d82a562c5595a0d75aaab97369e7761de279cdf0fb9b"} Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.385070 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.579086 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.589449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6" (OuterVolumeSpecName: "kube-api-access-6dqk6") pod "f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" (UID: "f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1"). InnerVolumeSpecName "kube-api-access-6dqk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.680680 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") on node \"crc\" DevicePath \"\"" Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.195374 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564656-7jksw" event={"ID":"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1","Type":"ContainerDied","Data":"1a176777ee24fdbcfa0fdb9bbe56ac8ed1ef043829a0399e6cd32adf043133da"} Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.195411 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a176777ee24fdbcfa0fdb9bbe56ac8ed1ef043829a0399e6cd32adf043133da" Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.195471 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw" Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.448503 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"] Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.455019 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"] Mar 19 00:16:06 crc kubenswrapper[4745]: I0319 00:16:06.146844 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" path="/var/lib/kubelet/pods/d14d18f5-0177-4458-8ea3-b266cc96d658/volumes" Mar 19 00:17:15 crc kubenswrapper[4745]: I0319 00:17:15.606759 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:17:15 crc kubenswrapper[4745]: I0319 00:17:15.607465 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:17:45 crc kubenswrapper[4745]: I0319 00:17:45.606586 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:17:45 crc kubenswrapper[4745]: I0319 00:17:45.608525 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.154524 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:18:00 crc kubenswrapper[4745]: E0319 00:18:00.155482 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerName="oc" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.155498 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerName="oc" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.155614 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerName="oc" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.156066 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.162819 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.172284 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.172796 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.173024 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.313353 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"auto-csr-approver-29564658-6vdd5\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.414899 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"auto-csr-approver-29564658-6vdd5\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.434688 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"auto-csr-approver-29564658-6vdd5\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " 
pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.489826 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.684699 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.959812 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerStarted","Data":"3165a41ad3cd4db1417a502aaf6684541dbe44ff096ef9becfcf62199c1f7dac"} Mar 19 00:18:01 crc kubenswrapper[4745]: I0319 00:18:01.966836 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerStarted","Data":"91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c"} Mar 19 00:18:01 crc kubenswrapper[4745]: I0319 00:18:01.982164 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" podStartSLOduration=1.08076951 podStartE2EDuration="1.982140234s" podCreationTimestamp="2026-03-19 00:18:00 +0000 UTC" firstStartedPulling="2026-03-19 00:18:00.695021216 +0000 UTC m=+645.233216347" lastFinishedPulling="2026-03-19 00:18:01.59639194 +0000 UTC m=+646.134587071" observedRunningTime="2026-03-19 00:18:01.980361689 +0000 UTC m=+646.518556830" watchObservedRunningTime="2026-03-19 00:18:01.982140234 +0000 UTC m=+646.520335365" Mar 19 00:18:02 crc kubenswrapper[4745]: I0319 00:18:02.974284 4745 generic.go:334] "Generic (PLEG): container finished" podID="7807a7d0-ff52-4a76-b083-19eca144b510" containerID="91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c" exitCode=0 Mar 19 00:18:02 crc kubenswrapper[4745]: 
I0319 00:18:02.974378 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerDied","Data":"91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c"} Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.224603 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.365358 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"7807a7d0-ff52-4a76-b083-19eca144b510\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.373260 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f" (OuterVolumeSpecName: "kube-api-access-9d94f") pod "7807a7d0-ff52-4a76-b083-19eca144b510" (UID: "7807a7d0-ff52-4a76-b083-19eca144b510"). InnerVolumeSpecName "kube-api-access-9d94f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.467412 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") on node \"crc\" DevicePath \"\"" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.988963 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerDied","Data":"3165a41ad3cd4db1417a502aaf6684541dbe44ff096ef9becfcf62199c1f7dac"} Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.989966 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3165a41ad3cd4db1417a502aaf6684541dbe44ff096ef9becfcf62199c1f7dac" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.988994 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:05 crc kubenswrapper[4745]: I0319 00:18:05.046328 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"] Mar 19 00:18:05 crc kubenswrapper[4745]: I0319 00:18:05.050231 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"] Mar 19 00:18:06 crc kubenswrapper[4745]: I0319 00:18:06.147594 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" path="/var/lib/kubelet/pods/9ef55829-c74d-4c78-b9b9-1c3ea05456e9/volumes" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.606250 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.607169 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.607252 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.608451 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.608559 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d" gracePeriod=600 Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.083811 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d" exitCode=0 Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.083982 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d"} Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.084354 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018"} Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.084385 4745 scope.go:117] "RemoveContainer" containerID="de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10" Mar 19 00:18:34 crc kubenswrapper[4745]: I0319 00:18:34.631142 4745 scope.go:117] "RemoveContainer" containerID="5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284" Mar 19 00:18:34 crc kubenswrapper[4745]: I0319 00:18:34.671986 4745 scope.go:117] "RemoveContainer" containerID="76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.079016 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2988"] Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.086138 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" containerID="cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.086993 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" containerID="cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087363 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087469 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" containerID="cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087551 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" containerID="cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087599 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" containerID="cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087744 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" containerID="cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.158356 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" 
podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" containerID="cri-o://878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.455656 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.459023 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-acl-logging/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.459552 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-controller/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.460368 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.509495 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/2.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510224 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510255 4745 generic.go:334] "Generic (PLEG): container finished" podID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" exitCode=2 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510316 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" 
event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerDied","Data":"24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510362 4745 scope.go:117] "RemoveContainer" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510942 4745 scope.go:117] "RemoveContainer" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.511243 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mlwp7_openshift-multus(6a0ae9c0-f19a-4038-be03-0fa6d223ebbf)\"" pod="openshift-multus/multus-mlwp7" podUID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519191 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-64vfm"] Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519537 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519560 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519577 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519588 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519603 4745 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kubecfg-setup" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519611 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kubecfg-setup" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519627 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519635 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519645 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7807a7d0-ff52-4a76-b083-19eca144b510" containerName="oc" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519656 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7807a7d0-ff52-4a76-b083-19eca144b510" containerName="oc" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519667 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519676 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519686 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519696 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519705 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519714 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519728 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519736 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519745 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519753 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519772 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519781 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519798 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519806 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519957 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7807a7d0-ff52-4a76-b083-19eca144b510" containerName="oc" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519973 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519984 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519994 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520006 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520017 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520029 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520039 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520048 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520060 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520071 4745 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.520197 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520206 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520323 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520337 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.520459 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520470 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.522653 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.523209 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.527796 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-acl-logging/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.528369 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-controller/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529071 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529118 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529128 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529142 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529153 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" 
containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529162 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529171 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" exitCode=143 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529181 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" exitCode=143 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529211 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529225 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529250 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529271 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529294 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529318 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529332 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529370 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529433 4745 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529443 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529471 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529483 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529490 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529498 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529506 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529516 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529522 4745 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529533 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529546 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529555 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529566 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529573 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529579 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529588 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 
00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529596 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529606 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529614 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529622 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529635 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529649 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529659 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529667 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529674 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529682 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529689 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529696 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529704 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529711 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529719 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529730 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"71f66e30efa5016ee954a4cb19c576186a237cdc85750ce0837af353f57d6b56"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529743 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529751 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529759 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529766 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529772 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529780 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529856 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 
00:19:19.529869 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529910 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529921 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.567787 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568109 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568179 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568207 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 
00:19:19.568240 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568269 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568305 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568338 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568354 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod 
\"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568398 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568414 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568437 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568451 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568471 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568508 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568550 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568573 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568613 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568630 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: 
\"21835778-c889-4031-b630-586c00f200f9\") " Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568896 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log" (OuterVolumeSpecName: "node-log") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569401 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569440 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569467 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569474 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569510 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569523 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash" (OuterVolumeSpecName: "host-slash") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569537 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569555 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket" (OuterVolumeSpecName: "log-socket") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569579 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569958 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.570086 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.570139 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568949 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.574764 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.575588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz" (OuterVolumeSpecName: "kube-api-access-kwglz") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "kube-api-access-kwglz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.583969 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.588934 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.607560 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.620777 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.638717 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.653631 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.666579 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670508 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-systemd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670543 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-var-lib-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 
00:19:19.670568 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrvb\" (UniqueName: \"kubernetes.io/projected/c5d25893-8bce-46da-9806-2fde750d93d0-kube-api-access-xdrvb\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670592 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-ovn\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670611 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-netns\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-script-lib\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671013 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-log-socket\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc 
kubenswrapper[4745]: I0319 00:19:19.671085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671154 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-slash\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671183 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-netd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671207 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671235 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: 
\"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671270 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-config\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671298 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-kubelet\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671337 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d25893-8bce-46da-9806-2fde750d93d0-ovn-node-metrics-cert\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671358 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-systemd-units\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671383 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-bin\") pod 
\"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671416 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-node-log\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671596 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-env-overrides\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671646 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-etc-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671827 4745 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671849 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671860 4745 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671873 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671904 4745 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671917 4745 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671933 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671945 4745 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671959 4745 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671974 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwglz\" (UniqueName: 
\"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671987 4745 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671999 4745 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672011 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672023 4745 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672034 4745 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672045 4745 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672058 4745 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc 
kubenswrapper[4745]: I0319 00:19:19.672069 4745 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672078 4745 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672090 4745 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.679034 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.692628 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.708762 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.724121 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.724598 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" 
containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.724631 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.724654 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.725053 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725109 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725150 4745 scope.go:117] 
"RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.725578 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725612 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725632 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.725970 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725991 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726008 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.726331 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726381 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726412 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.726951 4745 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726978 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726999 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.727250 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727277 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container 
\"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727291 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.727553 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727587 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727611 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.727861 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" 
containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727948 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727972 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.728268 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728296 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728315 4745 scope.go:117] 
"RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728559 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728582 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728831 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728851 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729192 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = 
NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729219 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729438 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729465 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729728 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729755 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc 
kubenswrapper[4745]: I0319 00:19:19.730038 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730056 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730230 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730249 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730535 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container 
with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730552 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730769 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730797 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731077 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731098 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731641 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731677 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732015 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732040 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732333 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not 
exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732352 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732598 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732629 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732965 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732994 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733276 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status 
\"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733302 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733558 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733590 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734063 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734090 4745 scope.go:117] "RemoveContainer" 
containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734346 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734373 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734646 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734676 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734945 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could 
not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734967 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735210 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735235 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735585 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735614 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 
00:19:19.735971 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735999 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736324 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736354 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736625 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with 
e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736645 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737061 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737090 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737424 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737446 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737804 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737821 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.738125 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773039 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-systemd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773112 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-var-lib-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773148 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrvb\" (UniqueName: \"kubernetes.io/projected/c5d25893-8bce-46da-9806-2fde750d93d0-kube-api-access-xdrvb\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773152 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-systemd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773188 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-ovn\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773222 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-netns\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773263 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-script-lib\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc 
kubenswrapper[4745]: I0319 00:19:19.773286 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-var-lib-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773297 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-log-socket\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773360 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-log-socket\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773376 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-ovn\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-netns\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773389 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773597 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-slash\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773658 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-netd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773762 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773768 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-netd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773768 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-slash\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773827 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773844 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773857 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-config\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.774329 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-script-lib\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.774873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-kubelet\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.774947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-kubelet\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d25893-8bce-46da-9806-2fde750d93d0-ovn-node-metrics-cert\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775069 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-systemd-units\") pod 
\"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775106 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-bin\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-node-log\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775256 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-config\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775271 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-env-overrides\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-etc-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775553 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-etc-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775561 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-bin\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775586 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-systemd-units\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775605 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-node-log\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-env-overrides\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.779168 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d25893-8bce-46da-9806-2fde750d93d0-ovn-node-metrics-cert\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.798187 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrvb\" (UniqueName: \"kubernetes.io/projected/c5d25893-8bce-46da-9806-2fde750d93d0-kube-api-access-xdrvb\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.855331 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.874054 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2988"] Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.878680 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2988"] Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.145211 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21835778-c889-4031-b630-586c00f200f9" path="/var/lib/kubelet/pods/21835778-c889-4031-b630-586c00f200f9/volumes" Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.536963 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/2.log" Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.540050 4745 generic.go:334] "Generic (PLEG): container finished" podID="c5d25893-8bce-46da-9806-2fde750d93d0" containerID="33ddc9ac56241f7dea8fc16946fde7802445ed2eda1d4a9745bd4ad269bdd58e" exitCode=0 Mar 19 00:19:20 crc kubenswrapper[4745]: 
I0319 00:19:20.540098 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerDied","Data":"33ddc9ac56241f7dea8fc16946fde7802445ed2eda1d4a9745bd4ad269bdd58e"} Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.540128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"b1c2f82361ff0cd2cae3b6f6fc64c915407d8c35cb4778248121cd1f76552426"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.550799 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"18e63ceecbccfa8df09defdfecaae77390bb55d5bda8680dba4b7dbeab634af3"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551272 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"402da4f74f043885d98933315cb68889e3d5f573bfe4e1da270e0f4df467536a"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551285 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"a58976e2141d0d0bf167da8229880b6e30f04b789dd285f4253a8f573f98e121"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551296 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"b0a673b932ca8a42026854f32cea1c4dc9c2c7a512f560e493085f95531dc135"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551306 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"4762d3a9f47e16b83eea535260293517472f549a8c14be6e95319819ab20e5bf"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551318 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"073eafa1e9157645b3f26f03b633f63ad809b9c7ae8bebb379a2f8f8310068f2"} Mar 19 00:19:24 crc kubenswrapper[4745]: I0319 00:19:24.573518 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"253414df0b83c02260ceae77d62dbdd5110b2c2b1345f5e00f05edfd31e4b1ae"} Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.587784 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"23bb1d1d3c3549c8bf7d4873d7650e19106674fdc6ed047085fd8ff8a4315b00"} Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.588752 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.588845 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.588941 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.619315 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.620706 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.629740 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" podStartSLOduration=7.629716723 podStartE2EDuration="7.629716723s" podCreationTimestamp="2026-03-19 00:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:19:26.62506819 +0000 UTC m=+731.163263331" watchObservedRunningTime="2026-03-19 00:19:26.629716723 +0000 UTC m=+731.167911854" Mar 19 00:19:31 crc kubenswrapper[4745]: I0319 00:19:31.138449 4745 scope.go:117] "RemoveContainer" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" Mar 19 00:19:31 crc kubenswrapper[4745]: E0319 00:19:31.139155 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mlwp7_openshift-multus(6a0ae9c0-f19a-4038-be03-0fa6d223ebbf)\"" pod="openshift-multus/multus-mlwp7" podUID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" Mar 19 00:19:44 crc kubenswrapper[4745]: I0319 00:19:44.138616 4745 scope.go:117] "RemoveContainer" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" Mar 19 00:19:44 crc kubenswrapper[4745]: I0319 00:19:44.705938 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/2.log" Mar 19 00:19:44 crc kubenswrapper[4745]: I0319 00:19:44.706604 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"3e39454c14fb4170d640e610a3b7a97f6bb1459a50ea1fd4e65e6ac76d961ee5"} Mar 19 00:19:49 crc kubenswrapper[4745]: I0319 
00:19:49.877699 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.134114 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.135833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.138333 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.138487 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.145667 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.148058 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.225196 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"auto-csr-approver-29564660-t5gfq\" (UID: \"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.325769 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"auto-csr-approver-29564660-t5gfq\" (UID: 
\"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.348366 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"auto-csr-approver-29564660-t5gfq\" (UID: \"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.458055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.684918 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.803249 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" event={"ID":"559d4ca4-399c-4504-8358-69d88bfdaf3a","Type":"ContainerStarted","Data":"e1365df5e61031fa7137d4e234b2937f4b37148d8b6ecce9356c1b4bd9b25ca6"} Mar 19 00:20:02 crc kubenswrapper[4745]: I0319 00:20:02.816053 4745 generic.go:334] "Generic (PLEG): container finished" podID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerID="1663b5c8bcd4ae3a664653728fe6c21020e126b8db8f2cf94f1cfba9c6c7bbc2" exitCode=0 Mar 19 00:20:02 crc kubenswrapper[4745]: I0319 00:20:02.816132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" event={"ID":"559d4ca4-399c-4504-8358-69d88bfdaf3a","Type":"ContainerDied","Data":"1663b5c8bcd4ae3a664653728fe6c21020e126b8db8f2cf94f1cfba9c6c7bbc2"} Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.053515 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.177611 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"559d4ca4-399c-4504-8358-69d88bfdaf3a\" (UID: \"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.183677 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr" (OuterVolumeSpecName: "kube-api-access-npkzr") pod "559d4ca4-399c-4504-8358-69d88bfdaf3a" (UID: "559d4ca4-399c-4504-8358-69d88bfdaf3a"). InnerVolumeSpecName "kube-api-access-npkzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.279066 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.829501 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" event={"ID":"559d4ca4-399c-4504-8358-69d88bfdaf3a","Type":"ContainerDied","Data":"e1365df5e61031fa7137d4e234b2937f4b37148d8b6ecce9356c1b4bd9b25ca6"} Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.829568 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1365df5e61031fa7137d4e234b2937f4b37148d8b6ecce9356c1b4bd9b25ca6" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.829570 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:05 crc kubenswrapper[4745]: I0319 00:20:05.117580 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:20:05 crc kubenswrapper[4745]: I0319 00:20:05.123207 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:20:06 crc kubenswrapper[4745]: I0319 00:20:06.145610 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" path="/var/lib/kubelet/pods/f7689b2b-3fcb-4122-bb50-fb8215cdb08b/volumes" Mar 19 00:20:15 crc kubenswrapper[4745]: I0319 00:20:15.606730 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:20:15 crc kubenswrapper[4745]: I0319 00:20:15.608732 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.411840 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.412948 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j587v" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" containerID="cri-o://60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" gracePeriod=30 Mar 19 00:20:24 crc 
kubenswrapper[4745]: I0319 00:20:24.771129 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.891679 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.892580 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.892780 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.892788 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities" (OuterVolumeSpecName: "utilities") pod "0dcd96ee-b500-4027-8a29-f0d6f59ea06b" (UID: "0dcd96ee-b500-4027-8a29-f0d6f59ea06b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.902249 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8" (OuterVolumeSpecName: "kube-api-access-fmdx8") pod "0dcd96ee-b500-4027-8a29-f0d6f59ea06b" (UID: "0dcd96ee-b500-4027-8a29-f0d6f59ea06b"). InnerVolumeSpecName "kube-api-access-fmdx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.922741 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dcd96ee-b500-4027-8a29-f0d6f59ea06b" (UID: "0dcd96ee-b500-4027-8a29-f0d6f59ea06b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.978800 4745 generic.go:334] "Generic (PLEG): container finished" podID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" exitCode=0 Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.979136 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.978896 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a"} Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.979439 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"5e90ccfe8369ff143297fb00f49862c706dc2eb6a69c3dc1f5670ef331a15a02"} Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.979558 4745 scope.go:117] "RemoveContainer" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.993873 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.994170 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.994259 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.003454 4745 scope.go:117] "RemoveContainer" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.031812 4745 scope.go:117] "RemoveContainer" 
containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.034212 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.040296 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.062004 4745 scope.go:117] "RemoveContainer" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" Mar 19 00:20:25 crc kubenswrapper[4745]: E0319 00:20:25.062749 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a\": container with ID starting with 60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a not found: ID does not exist" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.062861 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a"} err="failed to get container status \"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a\": rpc error: code = NotFound desc = could not find container \"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a\": container with ID starting with 60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a not found: ID does not exist" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.062957 4745 scope.go:117] "RemoveContainer" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" Mar 19 00:20:25 crc kubenswrapper[4745]: E0319 00:20:25.063524 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c\": container with ID starting with a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c not found: ID does not exist" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.063633 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c"} err="failed to get container status \"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c\": rpc error: code = NotFound desc = could not find container \"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c\": container with ID starting with a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c not found: ID does not exist" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.063732 4745 scope.go:117] "RemoveContainer" containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" Mar 19 00:20:25 crc kubenswrapper[4745]: E0319 00:20:25.064232 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae\": container with ID starting with d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae not found: ID does not exist" containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.064327 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae"} err="failed to get container status \"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae\": rpc error: code = NotFound desc = could not find container 
\"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae\": container with ID starting with d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae not found: ID does not exist" Mar 19 00:20:26 crc kubenswrapper[4745]: I0319 00:20:26.145168 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" path="/var/lib/kubelet/pods/0dcd96ee-b500-4027-8a29-f0d6f59ea06b/volumes" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.506314 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d"] Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.506991 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507006 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.507021 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="extract-utilities" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507028 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="extract-utilities" Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.507038 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerName="oc" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507047 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerName="oc" Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.507070 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" 
containerName="extract-content" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507075 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="extract-content" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507173 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerName="oc" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507182 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507952 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.511684 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.520465 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d"] Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.547703 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.547761 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.547869 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.651454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.651527 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.651614 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: 
\"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.652215 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.652615 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.679353 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.861027 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:29 crc kubenswrapper[4745]: I0319 00:20:29.064584 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d"] Mar 19 00:20:30 crc kubenswrapper[4745]: I0319 00:20:30.014195 4745 generic.go:334] "Generic (PLEG): container finished" podID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerID="0465dcfd21fde34a50f161d1cde6579abf629fb152d29ad2a3fdb5dacb7957d2" exitCode=0 Mar 19 00:20:30 crc kubenswrapper[4745]: I0319 00:20:30.014271 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"0465dcfd21fde34a50f161d1cde6579abf629fb152d29ad2a3fdb5dacb7957d2"} Mar 19 00:20:30 crc kubenswrapper[4745]: I0319 00:20:30.014705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerStarted","Data":"2223840d46c198dcf1ff6d4a9faf641dfd5931970f8a283d0f4aab34a9da5a6a"} Mar 19 00:20:32 crc kubenswrapper[4745]: I0319 00:20:32.033112 4745 generic.go:334] "Generic (PLEG): container finished" podID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerID="ef1ea707f8bf9df1e84b40963515384be179344be9dd602a8d23878a7e5524b6" exitCode=0 Mar 19 00:20:32 crc kubenswrapper[4745]: I0319 00:20:32.033216 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"ef1ea707f8bf9df1e84b40963515384be179344be9dd602a8d23878a7e5524b6"} Mar 19 00:20:33 crc kubenswrapper[4745]: I0319 00:20:33.042315 4745 
generic.go:334] "Generic (PLEG): container finished" podID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerID="f5380b52bacf8a1042f9633d7a52226818469c07b6fd057be0b678d8b9909f81" exitCode=0 Mar 19 00:20:33 crc kubenswrapper[4745]: I0319 00:20:33.042374 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"f5380b52bacf8a1042f9633d7a52226818469c07b6fd057be0b678d8b9909f81"} Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.304561 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892"] Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.306382 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.319968 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892"] Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.340832 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.340955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" 
(UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.340999 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.397142 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.441921 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod \"6e2c290f-c398-4c6e-9dec-82038e0bda08\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442046 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"6e2c290f-c398-4c6e-9dec-82038e0bda08\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442167 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"6e2c290f-c398-4c6e-9dec-82038e0bda08\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442413 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442464 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.443257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.443248 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") 
pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.446009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle" (OuterVolumeSpecName: "bundle") pod "6e2c290f-c398-4c6e-9dec-82038e0bda08" (UID: "6e2c290f-c398-4c6e-9dec-82038e0bda08"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.450601 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9" (OuterVolumeSpecName: "kube-api-access-cxct9") pod "6e2c290f-c398-4c6e-9dec-82038e0bda08" (UID: "6e2c290f-c398-4c6e-9dec-82038e0bda08"). InnerVolumeSpecName "kube-api-access-cxct9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.466007 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.469982 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util" (OuterVolumeSpecName: "util") pod "6e2c290f-c398-4c6e-9dec-82038e0bda08" (UID: "6e2c290f-c398-4c6e-9dec-82038e0bda08"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.543780 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.543827 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.543839 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.632252 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.760194 4745 scope.go:117] "RemoveContainer" containerID="d0b7bf2fb29c7b89c86195effcef47a72d6e88d2457a53a6804ca521616f6ee6" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.858296 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892"] Mar 19 00:20:34 crc kubenswrapper[4745]: W0319 00:20:34.867811 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8baea7f9_4007_48cc_a849_7b8ce10c526b.slice/crio-c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90 WatchSource:0}: Error finding container c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90: Status 404 returned error can't find the container with id c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90 
Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.058354 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerStarted","Data":"c1a0807a6be939d584d1b6f01c689a15037cfa9127acd29591944dcc243a8195"} Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.058684 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerStarted","Data":"c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90"} Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.061809 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"2223840d46c198dcf1ff6d4a9faf641dfd5931970f8a283d0f4aab34a9da5a6a"} Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.061837 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2223840d46c198dcf1ff6d4a9faf641dfd5931970f8a283d0f4aab34a9da5a6a" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.061947 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.313584 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2"] Mar 19 00:20:35 crc kubenswrapper[4745]: E0319 00:20:35.313963 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="util" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.313983 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="util" Mar 19 00:20:35 crc kubenswrapper[4745]: E0319 00:20:35.314009 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="extract" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.314016 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="extract" Mar 19 00:20:35 crc kubenswrapper[4745]: E0319 00:20:35.314031 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="pull" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.314040 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="pull" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.314136 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="extract" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.319850 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.326001 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2"] Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.355070 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.355207 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.355243 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.456517 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.456590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.456616 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.457437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.457624 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: 
\"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.480028 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.635422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.887963 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2"] Mar 19 00:20:35 crc kubenswrapper[4745]: W0319 00:20:35.896033 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef9c16b_de5e_456d_899c_15bdcfba6c89.slice/crio-d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113 WatchSource:0}: Error finding container d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113: Status 404 returned error can't find the container with id d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113 Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.068801 4745 generic.go:334] "Generic (PLEG): container finished" podID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerID="c1a0807a6be939d584d1b6f01c689a15037cfa9127acd29591944dcc243a8195" exitCode=0 Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.068937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"c1a0807a6be939d584d1b6f01c689a15037cfa9127acd29591944dcc243a8195"} Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.071698 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerStarted","Data":"87bb193744374a4892e61e2db169e197782d1eba0014818900e955f51aefbdfe"} Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.071749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerStarted","Data":"d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113"} Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.081176 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerID="87bb193744374a4892e61e2db169e197782d1eba0014818900e955f51aefbdfe" exitCode=0 Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.081298 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"87bb193744374a4892e61e2db169e197782d1eba0014818900e955f51aefbdfe"} Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.085264 4745 generic.go:334] "Generic (PLEG): container finished" podID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerID="74c1fd116e1be56ae0ca1d679824c6061d578cdad159dd7bdee7cf2704e0e3f4" exitCode=0 Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.085298 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"74c1fd116e1be56ae0ca1d679824c6061d578cdad159dd7bdee7cf2704e0e3f4"} Mar 19 00:20:38 crc kubenswrapper[4745]: I0319 00:20:38.093707 4745 generic.go:334] "Generic (PLEG): container finished" podID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerID="6eb0b771b13168857799256a37aec560362ff2f87f67686d96e809d87ee9de8a" exitCode=0 Mar 19 00:20:38 crc kubenswrapper[4745]: I0319 00:20:38.093826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"6eb0b771b13168857799256a37aec560362ff2f87f67686d96e809d87ee9de8a"} Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.511745 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.633415 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") pod \"8baea7f9-4007-48cc-a849-7b8ce10c526b\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.633555 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"8baea7f9-4007-48cc-a849-7b8ce10c526b\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.633585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fvzm\" (UniqueName: 
\"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"8baea7f9-4007-48cc-a849-7b8ce10c526b\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.634508 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle" (OuterVolumeSpecName: "bundle") pod "8baea7f9-4007-48cc-a849-7b8ce10c526b" (UID: "8baea7f9-4007-48cc-a849-7b8ce10c526b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.640858 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm" (OuterVolumeSpecName: "kube-api-access-5fvzm") pod "8baea7f9-4007-48cc-a849-7b8ce10c526b" (UID: "8baea7f9-4007-48cc-a849-7b8ce10c526b"). InnerVolumeSpecName "kube-api-access-5fvzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.653826 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util" (OuterVolumeSpecName: "util") pod "8baea7f9-4007-48cc-a849-7b8ce10c526b" (UID: "8baea7f9-4007-48cc-a849-7b8ce10c526b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.735871 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.735933 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.735947 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.205640 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90"} Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.205704 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90" Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.205695 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.207841 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerID="320635c6f3f06d3534da7620d1c32e51b07e2f3cfb8d016e53c4d638a8ff1591" exitCode=0 Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.207915 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"320635c6f3f06d3534da7620d1c32e51b07e2f3cfb8d016e53c4d638a8ff1591"} Mar 19 00:20:41 crc kubenswrapper[4745]: I0319 00:20:41.216431 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerID="cc6d2564161abb289a0cdfc21878f1353ec0ea6ab461cff0ac11a657e6894e3e" exitCode=0 Mar 19 00:20:41 crc kubenswrapper[4745]: I0319 00:20:41.216530 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"cc6d2564161abb289a0cdfc21878f1353ec0ea6ab461cff0ac11a657e6894e3e"} Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.063491 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091641 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l"] Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091911 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091925 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091937 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091944 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091958 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091966 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091976 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091982 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091993 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092000 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.092014 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092021 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092107 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092117 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092963 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.109924 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l"] Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.253617 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.253762 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.253821 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.254078 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.254553 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.254666 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.255598 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle" (OuterVolumeSpecName: "bundle") pod "9ef9c16b-de5e-456d-899c-15bdcfba6c89" (UID: "9ef9c16b-de5e-456d-899c-15bdcfba6c89"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.272263 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util" (OuterVolumeSpecName: "util") pod "9ef9c16b-de5e-456d-899c-15bdcfba6c89" (UID: "9ef9c16b-de5e-456d-899c-15bdcfba6c89"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.272507 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz" (OuterVolumeSpecName: "kube-api-access-jfxlz") pod "9ef9c16b-de5e-456d-899c-15bdcfba6c89" (UID: "9ef9c16b-de5e-456d-899c-15bdcfba6c89"). InnerVolumeSpecName "kube-api-access-jfxlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.278825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113"} Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.278872 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.278968 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356135 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356298 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356311 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxlz\" (UniqueName: 
\"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356321 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.357002 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.396367 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.407941 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.105859 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l"] Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.301700 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerStarted","Data":"3ef7a137321f30be497e8cc61b9023655954715f64b2bf473455bf5c69e5a016"} Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.606169 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.606903 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:20:46 crc kubenswrapper[4745]: I0319 00:20:46.308863 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerID="187af0133ef427170b8c6022e9a4d4d260d309d247b481838c77af900f2c83b6" exitCode=0 Mar 19 00:20:46 crc kubenswrapper[4745]: I0319 00:20:46.308919 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" 
event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"187af0133ef427170b8c6022e9a4d4d260d309d247b481838c77af900f2c83b6"} Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.751173 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm"] Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.752230 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.757649 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.757548 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.759434 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zppsp" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.828092 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm"] Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.897232 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zt6\" (UniqueName: \"kubernetes.io/projected/bcf530c8-afe8-4a0e-9e5c-bfd85712e37a-kube-api-access-c6zt6\") pod \"obo-prometheus-operator-8ff7d675-tz2rm\" (UID: \"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.998498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zt6\" (UniqueName: \"kubernetes.io/projected/bcf530c8-afe8-4a0e-9e5c-bfd85712e37a-kube-api-access-c6zt6\") pod 
\"obo-prometheus-operator-8ff7d675-tz2rm\" (UID: \"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.035664 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zt6\" (UniqueName: \"kubernetes.io/projected/bcf530c8-afe8-4a0e-9e5c-bfd85712e37a-kube-api-access-c6zt6\") pod \"obo-prometheus-operator-8ff7d675-tz2rm\" (UID: \"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.068712 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.368926 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.376242 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.384912 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fbxzm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.385825 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.396963 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.398755 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.399862 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.461580 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508393 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508452 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508474 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.580135 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm"] Mar 19 00:20:48 crc kubenswrapper[4745]: W0319 00:20:48.599232 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf530c8_afe8_4a0e_9e5c_bfd85712e37a.slice/crio-39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1 WatchSource:0}: Error finding container 39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1: Status 404 returned error can't find the container with id 39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1 Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.614998 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.615060 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.615101 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.615146 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.623272 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.628270 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.628859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.630521 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.718825 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.726236 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.792537 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-shlz7"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.793374 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.795806 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-tdc6z" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.798348 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.852277 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-shlz7"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.931813 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsg5k\" (UniqueName: \"kubernetes.io/projected/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-kube-api-access-fsg5k\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.932388 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.034173 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsg5k\" (UniqueName: \"kubernetes.io/projected/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-kube-api-access-fsg5k\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.034539 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.040041 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.073764 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsg5k\" (UniqueName: \"kubernetes.io/projected/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-kube-api-access-fsg5k\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.119550 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.375299 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" event={"ID":"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a","Type":"ContainerStarted","Data":"39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1"} Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.440720 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-78548ff687-rjvkn"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.449836 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.454785 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.455148 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zf5z7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.489578 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-78548ff687-rjvkn"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.504319 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.551291 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-apiservice-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc 
kubenswrapper[4745]: I0319 00:20:49.551350 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgn6\" (UniqueName: \"kubernetes.io/projected/58d05d23-3632-4b84-94f8-1db548b90a03-kube-api-access-bfgn6\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.551384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/58d05d23-3632-4b84-94f8-1db548b90a03-openshift-service-ca\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.551425 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-webhook-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.635361 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654629 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/58d05d23-3632-4b84-94f8-1db548b90a03-openshift-service-ca\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654718 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-webhook-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654767 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-apiservice-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654789 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgn6\" (UniqueName: \"kubernetes.io/projected/58d05d23-3632-4b84-94f8-1db548b90a03-kube-api-access-bfgn6\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.656831 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/58d05d23-3632-4b84-94f8-1db548b90a03-openshift-service-ca\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.670685 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-webhook-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: W0319 00:20:49.672318 
4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c4fe84_51cb_479a_a8cc_2e07bde21417.slice/crio-7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386 WatchSource:0}: Error finding container 7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386: Status 404 returned error can't find the container with id 7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386 Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.683720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-apiservice-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.684478 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgn6\" (UniqueName: \"kubernetes.io/projected/58d05d23-3632-4b84-94f8-1db548b90a03-kube-api-access-bfgn6\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.786408 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.854739 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-shlz7"] Mar 19 00:20:49 crc kubenswrapper[4745]: W0319 00:20:49.882861 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c2afa4_a0d4_4aad_b6ad_31b7cb4c9534.slice/crio-c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb WatchSource:0}: Error finding container c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb: Status 404 returned error can't find the container with id c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.104981 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-78548ff687-rjvkn"] Mar 19 00:20:50 crc kubenswrapper[4745]: W0319 00:20:50.139312 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d05d23_3632_4b84_94f8_1db548b90a03.slice/crio-03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc WatchSource:0}: Error finding container 03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc: Status 404 returned error can't find the container with id 03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.403495 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" event={"ID":"f5c4fe84-51cb-479a-a8cc-2e07bde21417","Type":"ContainerStarted","Data":"7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386"} Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.404961 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" event={"ID":"69d14850-5c50-4c06-8581-2a70644c7de7","Type":"ContainerStarted","Data":"1ae45c16e4eb0987ccf1106b6425fc706c42ab5acb437b54bf7fa9c7f187c3ab"} Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.406556 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" event={"ID":"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534","Type":"ContainerStarted","Data":"c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb"} Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.408639 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-78548ff687-rjvkn" event={"ID":"58d05d23-3632-4b84-94f8-1db548b90a03","Type":"ContainerStarted","Data":"03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc"} Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.257838 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-f6stw"] Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.259352 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.262321 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-pwn48" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.262582 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.266352 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.363444 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpkxz\" (UniqueName: \"kubernetes.io/projected/9f239a73-6d47-4e9b-a74e-f97757ec8e4f-kube-api-access-wpkxz\") pod \"interconnect-operator-5bb49f789d-f6stw\" (UID: \"9f239a73-6d47-4e9b-a74e-f97757ec8e4f\") " pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.465086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpkxz\" (UniqueName: \"kubernetes.io/projected/9f239a73-6d47-4e9b-a74e-f97757ec8e4f-kube-api-access-wpkxz\") pod \"interconnect-operator-5bb49f789d-f6stw\" (UID: \"9f239a73-6d47-4e9b-a74e-f97757ec8e4f\") " pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.492388 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpkxz\" (UniqueName: \"kubernetes.io/projected/9f239a73-6d47-4e9b-a74e-f97757ec8e4f-kube-api-access-wpkxz\") pod \"interconnect-operator-5bb49f789d-f6stw\" (UID: \"9f239a73-6d47-4e9b-a74e-f97757ec8e4f\") " pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.537444 
4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-f6stw"] Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.586550 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.904039 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-64749dc85d-gzlt5"] Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.906712 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.910201 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.911857 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-qp99d" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.923218 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-64749dc85d-gzlt5"] Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.948046 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-webhook-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.948125 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qtj\" (UniqueName: \"kubernetes.io/projected/5ba244a1-0d6a-4cab-84c4-51501f3c7916-kube-api-access-c6qtj\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: 
\"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.948217 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-apiservice-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.049198 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-apiservice-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.049273 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-webhook-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.049305 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qtj\" (UniqueName: \"kubernetes.io/projected/5ba244a1-0d6a-4cab-84c4-51501f3c7916-kube-api-access-c6qtj\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.056455 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-apiservice-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.068741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-webhook-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.096691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qtj\" (UniqueName: \"kubernetes.io/projected/5ba244a1-0d6a-4cab-84c4-51501f3c7916-kube-api-access-c6qtj\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.227583 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:57 crc kubenswrapper[4745]: I0319 00:20:57.895928 4745 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.842976 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:161082f81c8c77471a421b3b4bcb8a47ca64aa08a5dd1abf27e7f2f964b35a2a" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.843690 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:161082f81c8c77471a421b3b4bcb8a47ca64aa08a5dd1abf27e7f2f964b35a2a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true 
--disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:e4412f5688c9725f36d2f566f624d82a1a2a5b957686245fd2defcc39604bdc2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c6zt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-8ff7d675-tz2rm_openshift-operators(bcf530c8-afe8-4a0e-9e5c-bfd85712e37a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.844866 4745 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" podUID="bcf530c8-afe8-4a0e-9e5c-bfd85712e37a" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.966027 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:161082f81c8c77471a421b3b4bcb8a47ca64aa08a5dd1abf27e7f2f964b35a2a\\\"\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" podUID="bcf530c8-afe8-4a0e-9e5c-bfd85712e37a" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.317174 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.317417 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2_openshift-operators(f5c4fe84-51cb-479a-a8cc-2e07bde21417): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.318578 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" podUID="f5c4fe84-51cb-479a-a8cc-2e07bde21417" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.362840 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.363072 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78_openshift-operators(69d14850-5c50-4c06-8581-2a70644c7de7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.364202 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" podUID="69d14850-5c50-4c06-8581-2a70644c7de7" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.971857 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" podUID="f5c4fe84-51cb-479a-a8cc-2e07bde21417" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.972053 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" podUID="69d14850-5c50-4c06-8581-2a70644c7de7" Mar 19 00:21:09 crc kubenswrapper[4745]: I0319 00:21:09.645137 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-64749dc85d-gzlt5"] Mar 19 00:21:09 crc kubenswrapper[4745]: W0319 00:21:09.662100 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba244a1_0d6a_4cab_84c4_51501f3c7916.slice/crio-1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630 WatchSource:0}: Error finding container 1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630: Status 404 returned error can't find the container with id 1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630 Mar 19 00:21:09 crc kubenswrapper[4745]: I0319 00:21:09.664781 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:21:09 crc kubenswrapper[4745]: I0319 00:21:09.680192 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-f6stw"] Mar 19 00:21:09 crc kubenswrapper[4745]: 
W0319 00:21:09.692579 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f239a73_6d47_4e9b_a74e_f97757ec8e4f.slice/crio-31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf WatchSource:0}: Error finding container 31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf: Status 404 returned error can't find the container with id 31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.083912 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" event={"ID":"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534","Type":"ContainerStarted","Data":"a40e618f9855ef6ca3e897be6a0a37a8d7e52ce962bccc20fcf303fb42ceddcb"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.085535 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.087401 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.088559 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerID="0a0bc5818b6461e81b5a0781f254c9da8578165e3c8d81b051eb92f1dadd585e" exitCode=0 Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.088640 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"0a0bc5818b6461e81b5a0781f254c9da8578165e3c8d81b051eb92f1dadd585e"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.092334 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-78548ff687-rjvkn" event={"ID":"58d05d23-3632-4b84-94f8-1db548b90a03","Type":"ContainerStarted","Data":"be55ef8f11a080e93ad68459f15d472dd4f7bd48a72237eaeb2ed8e11dd72157"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.092481 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.093788 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" event={"ID":"5ba244a1-0d6a-4cab-84c4-51501f3c7916","Type":"ContainerStarted","Data":"1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.094819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" event={"ID":"9f239a73-6d47-4e9b-a74e-f97757ec8e4f","Type":"ContainerStarted","Data":"31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.588094 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" podStartSLOduration=3.6359071849999998 podStartE2EDuration="22.588072634s" podCreationTimestamp="2026-03-19 00:20:48 +0000 UTC" firstStartedPulling="2026-03-19 00:20:49.902964667 +0000 UTC m=+814.441159798" lastFinishedPulling="2026-03-19 00:21:08.855130116 +0000 UTC m=+833.393325247" observedRunningTime="2026-03-19 00:21:10.582922104 +0000 UTC m=+835.121117245" watchObservedRunningTime="2026-03-19 00:21:10.588072634 +0000 UTC m=+835.126267765" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.777331 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-78548ff687-rjvkn" podStartSLOduration=3.08036264 podStartE2EDuration="21.777308206s" 
podCreationTimestamp="2026-03-19 00:20:49 +0000 UTC" firstStartedPulling="2026-03-19 00:20:50.151789123 +0000 UTC m=+814.689984254" lastFinishedPulling="2026-03-19 00:21:08.848734689 +0000 UTC m=+833.386929820" observedRunningTime="2026-03-19 00:21:10.678428133 +0000 UTC m=+835.216623264" watchObservedRunningTime="2026-03-19 00:21:10.777308206 +0000 UTC m=+835.315503337" Mar 19 00:21:11 crc kubenswrapper[4745]: I0319 00:21:11.156562 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerStarted","Data":"7c68b3f23d960dc40d0683e2267b0f117a83d96c74d92e09b8cd76b3b2b28942"} Mar 19 00:21:11 crc kubenswrapper[4745]: I0319 00:21:11.216275 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" podStartSLOduration=5.748639025 podStartE2EDuration="28.216244664s" podCreationTimestamp="2026-03-19 00:20:43 +0000 UTC" firstStartedPulling="2026-03-19 00:20:46.310738304 +0000 UTC m=+810.848933435" lastFinishedPulling="2026-03-19 00:21:08.778343943 +0000 UTC m=+833.316539074" observedRunningTime="2026-03-19 00:21:11.213292303 +0000 UTC m=+835.751487464" watchObservedRunningTime="2026-03-19 00:21:11.216244664 +0000 UTC m=+835.754439795" Mar 19 00:21:12 crc kubenswrapper[4745]: I0319 00:21:12.167603 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerID="7c68b3f23d960dc40d0683e2267b0f117a83d96c74d92e09b8cd76b3b2b28942" exitCode=0 Mar 19 00:21:12 crc kubenswrapper[4745]: I0319 00:21:12.168577 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" 
event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"7c68b3f23d960dc40d0683e2267b0f117a83d96c74d92e09b8cd76b3b2b28942"} Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.489090 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.505732 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.505854 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.506046 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.508325 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle" (OuterVolumeSpecName: "bundle") pod "4ef969f2-d76b-405e-baaf-c10a36d36ed3" (UID: "4ef969f2-d76b-405e-baaf-c10a36d36ed3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.514211 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l" (OuterVolumeSpecName: "kube-api-access-m4z5l") pod "4ef969f2-d76b-405e-baaf-c10a36d36ed3" (UID: "4ef969f2-d76b-405e-baaf-c10a36d36ed3"). InnerVolumeSpecName "kube-api-access-m4z5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.527974 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util" (OuterVolumeSpecName: "util") pod "4ef969f2-d76b-405e-baaf-c10a36d36ed3" (UID: "4ef969f2-d76b-405e-baaf-c10a36d36ed3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.607034 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.607074 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.607089 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") on node \"crc\" DevicePath \"\"" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.188413 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" 
event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"3ef7a137321f30be497e8cc61b9023655954715f64b2bf473455bf5c69e5a016"} Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.188479 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ef7a137321f30be497e8cc61b9023655954715f64b2bf473455bf5c69e5a016" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.188552 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642296 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:14 crc kubenswrapper[4745]: E0319 00:21:14.642571 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="pull" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642586 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="pull" Mar 19 00:21:14 crc kubenswrapper[4745]: E0319 00:21:14.642596 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="util" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642603 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="util" Mar 19 00:21:14 crc kubenswrapper[4745]: E0319 00:21:14.642619 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="extract" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642627 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="extract" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642728 4745 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="extract" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.643649 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.714403 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.722543 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.722610 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.722649 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824105 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"redhat-operators-mtlwq\" 
(UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824160 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824625 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.825021 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.186201 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " 
pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.260561 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.605591 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.605657 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.605706 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.606650 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.606723 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" 
containerID="cri-o://c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018" gracePeriod=600 Mar 19 00:21:15 crc kubenswrapper[4745]: E0319 00:21:15.746056 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400972f4_050f_4f26_b982_ced6f2590c8b.slice/crio-conmon-c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400972f4_050f_4f26_b982_ced6f2590c8b.slice/crio-c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018.scope\": RecentStats: unable to find data in memory cache]" Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.252123 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018" exitCode=0 Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.252689 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018"} Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.252750 4745 scope.go:117] "RemoveContainer" containerID="051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d" Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.778763 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.265397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" 
event={"ID":"5ba244a1-0d6a-4cab-84c4-51501f3c7916","Type":"ContainerStarted","Data":"a2d2bb0f1656421fad969dd9938608d7c1f3a0c6742aaaefc4088efa505ad3db"} Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.297138 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06"} Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.310093 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerStarted","Data":"160fc9f43fde151a57b67e2c8925b999f7f111696e51e7dd87c757005709ba3f"} Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.343196 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" podStartSLOduration=15.756627829 podStartE2EDuration="22.343177996s" podCreationTimestamp="2026-03-19 00:20:55 +0000 UTC" firstStartedPulling="2026-03-19 00:21:09.664535923 +0000 UTC m=+834.202731054" lastFinishedPulling="2026-03-19 00:21:16.25108609 +0000 UTC m=+840.789281221" observedRunningTime="2026-03-19 00:21:17.310904773 +0000 UTC m=+841.849099904" watchObservedRunningTime="2026-03-19 00:21:17.343177996 +0000 UTC m=+841.881373127" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.350352 4745 generic.go:334] "Generic (PLEG): container finished" podID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerID="4d2e067269ab8d6ad4ff9fd9c0c25e46f9248271c2c159ef4b71cf8df753032d" exitCode=0 Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.350550 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" 
event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"4d2e067269ab8d6ad4ff9fd9c0c25e46f9248271c2c159ef4b71cf8df753032d"} Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.357333 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" event={"ID":"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a","Type":"ContainerStarted","Data":"5e7bbf77fd0253e4cc85b5b6317591552343f2a1875ac824d63bdc7e790b1814"} Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.422277 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" podStartSLOduration=3.457659316 podStartE2EDuration="31.422258392s" podCreationTimestamp="2026-03-19 00:20:47 +0000 UTC" firstStartedPulling="2026-03-19 00:20:48.603006074 +0000 UTC m=+813.141201205" lastFinishedPulling="2026-03-19 00:21:16.56760515 +0000 UTC m=+841.105800281" observedRunningTime="2026-03-19 00:21:18.419305491 +0000 UTC m=+842.957500642" watchObservedRunningTime="2026-03-19 00:21:18.422258392 +0000 UTC m=+842.960453523" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.583852 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.585045 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594142 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594616 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594666 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-p9t9n" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594159 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594253 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594339 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594420 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594494 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594567 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.617837 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733029 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733091 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733126 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733166 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-http-certificates\") pod 
\"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733196 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733223 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733262 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733287 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733306 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733323 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733340 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/884040c3-6c56-45b0-881d-e73f52c0ab34-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733387 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835280 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835383 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835414 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835438 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835472 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835521 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835538 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835566 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835588 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835610 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/884040c3-6c56-45b0-881d-e73f52c0ab34-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835695 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835728 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.836016 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.836541 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.836535 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.844456 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.845320 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847267 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847454 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847695 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847844 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.848294 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/884040c3-6c56-45b0-881d-e73f52c0ab34-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.848347 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.848798 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" 
(UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.855173 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.871616 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.883637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.907289 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:19 crc kubenswrapper[4745]: I0319 00:21:19.790461 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.938734 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g"] Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.940702 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.944358 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.945169 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-v22wp" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.946397 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.966492 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g"] Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.110781 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ltd\" (UniqueName: \"kubernetes.io/projected/1340e1bb-a8aa-4a0c-b295-f49f94e81055-kube-api-access-q5ltd\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc 
kubenswrapper[4745]: I0319 00:21:28.110916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1340e1bb-a8aa-4a0c-b295-f49f94e81055-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.531590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1340e1bb-a8aa-4a0c-b295-f49f94e81055-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.531705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ltd\" (UniqueName: \"kubernetes.io/projected/1340e1bb-a8aa-4a0c-b295-f49f94e81055-kube-api-access-q5ltd\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.533133 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1340e1bb-a8aa-4a0c-b295-f49f94e81055-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.594932 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ltd\" (UniqueName: 
\"kubernetes.io/projected/1340e1bb-a8aa-4a0c-b295-f49f94e81055-kube-api-access-q5ltd\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.870642 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.485986 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.486754 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpkxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
interconnect-operator-5bb49f789d-f6stw_service-telemetry(9f239a73-6d47-4e9b-a74e-f97757ec8e4f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.488228 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" podUID="9f239a73-6d47-4e9b-a74e-f97757ec8e4f" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.903004 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" podUID="9f239a73-6d47-4e9b-a74e-f97757ec8e4f" Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.213684 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.222744 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g"] Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.926213 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerStarted","Data":"d7bb1db6b8b2ffad83723d95e989fc91204d27db1c2888598e44eba176edc90c"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.928427 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" event={"ID":"f5c4fe84-51cb-479a-a8cc-2e07bde21417","Type":"ContainerStarted","Data":"67b90f26a2cc7c0a5914348ece6b1670b8a3005382d0d4bc27217d72c3f1d8fc"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.931683 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" event={"ID":"69d14850-5c50-4c06-8581-2a70644c7de7","Type":"ContainerStarted","Data":"08fe4f272cd75074db9f6367a8948a99b41f1f15b9a54e731cc1ffcbb7774578"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.934453 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerStarted","Data":"24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.936146 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" event={"ID":"1340e1bb-a8aa-4a0c-b295-f49f94e81055","Type":"ContainerStarted","Data":"759b9734b100c09547930afd66df15fff628cb84b25b4c0fbdd4b1ecd7514cd2"} Mar 19 00:21:34 crc kubenswrapper[4745]: I0319 00:21:34.044779 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" podStartSLOduration=3.153676139 podStartE2EDuration="46.044747697s" podCreationTimestamp="2026-03-19 00:20:48 +0000 UTC" firstStartedPulling="2026-03-19 00:20:49.686324651 +0000 UTC m=+814.224519772" lastFinishedPulling="2026-03-19 00:21:32.577396199 +0000 UTC m=+857.115591330" observedRunningTime="2026-03-19 00:21:33.994863091 +0000 UTC m=+858.533058232" watchObservedRunningTime="2026-03-19 00:21:34.044747697 +0000 UTC m=+858.582942838" Mar 19 00:21:34 crc kubenswrapper[4745]: I0319 00:21:34.046417 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" podStartSLOduration=3.006949643 podStartE2EDuration="46.046406688s" podCreationTimestamp="2026-03-19 00:20:48 +0000 UTC" firstStartedPulling="2026-03-19 00:20:49.539945216 +0000 UTC m=+814.078140347" lastFinishedPulling="2026-03-19 00:21:32.579402261 +0000 UTC m=+857.117597392" observedRunningTime="2026-03-19 00:21:34.036332084 +0000 UTC m=+858.574527245" watchObservedRunningTime="2026-03-19 00:21:34.046406688 +0000 UTC m=+858.584601819" Mar 19 00:21:37 crc kubenswrapper[4745]: I0319 00:21:37.037356 4745 generic.go:334] "Generic (PLEG): container finished" podID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerID="24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb" exitCode=0 Mar 19 00:21:37 crc kubenswrapper[4745]: I0319 00:21:37.038192 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb"} Mar 19 00:21:38 crc kubenswrapper[4745]: I0319 00:21:38.083996 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerStarted","Data":"857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1"} Mar 19 00:21:38 crc kubenswrapper[4745]: I0319 00:21:38.113620 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mtlwq" podStartSLOduration=14.205200094 podStartE2EDuration="24.113588986s" podCreationTimestamp="2026-03-19 00:21:14 +0000 UTC" firstStartedPulling="2026-03-19 00:21:27.739141115 +0000 UTC m=+852.277336256" lastFinishedPulling="2026-03-19 00:21:37.647530017 +0000 UTC m=+862.185725148" observedRunningTime="2026-03-19 
00:21:38.110125409 +0000 UTC m=+862.648320550" watchObservedRunningTime="2026-03-19 00:21:38.113588986 +0000 UTC m=+862.651784117" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.708565 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.710199 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716460 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716627 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716709 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716767 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.773215 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821011 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821058 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821656 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821734 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821972 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822005 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod 
\"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822060 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822126 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822211 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822272 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.923930 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924031 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924084 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924113 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924136 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924159 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924210 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924238 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924270 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924339 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.925343 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.925857 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.926072 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.926605 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.927449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc 
kubenswrapper[4745]: I0319 00:21:44.928198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.928297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.928364 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.928819 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.934647 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.948021 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.955941 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:45 crc kubenswrapper[4745]: I0319 00:21:45.030017 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:45 crc kubenswrapper[4745]: I0319 00:21:45.261286 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:45 crc kubenswrapper[4745]: I0319 00:21:45.261692 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:46 crc kubenswrapper[4745]: I0319 00:21:46.439206 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" probeResult="failure" output=< Mar 19 00:21:46 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Mar 19 00:21:46 crc kubenswrapper[4745]: > Mar 19 00:21:49 crc kubenswrapper[4745]: I0319 00:21:49.299215 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.575287 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" event={"ID":"1340e1bb-a8aa-4a0c-b295-f49f94e81055","Type":"ContainerStarted","Data":"e327f92dbc5fff52d1912124f9ce2f0f29ee4ad9e54e3419624e6d02eb6e7ea1"} Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.586490 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6edd3146-9ede-4a63-b72b-5987ed600bce","Type":"ContainerStarted","Data":"694ce7af8b3166a88da3080fef22e0db15c17a9128111ade8560fc1f11e9a583"} Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.593615 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" 
event={"ID":"9f239a73-6d47-4e9b-a74e-f97757ec8e4f","Type":"ContainerStarted","Data":"2755f249469ad287fc3eb9a78615601d6397f223089eb1ec2a6af5c32365dfda"} Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.618371 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" podStartSLOduration=7.82826577 podStartE2EDuration="23.618336028s" podCreationTimestamp="2026-03-19 00:21:27 +0000 UTC" firstStartedPulling="2026-03-19 00:21:33.23036525 +0000 UTC m=+857.768560381" lastFinishedPulling="2026-03-19 00:21:49.020435518 +0000 UTC m=+873.558630639" observedRunningTime="2026-03-19 00:21:50.608375956 +0000 UTC m=+875.146571097" watchObservedRunningTime="2026-03-19 00:21:50.618336028 +0000 UTC m=+875.156531169" Mar 19 00:21:55 crc kubenswrapper[4745]: I0319 00:21:55.228533 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" podStartSLOduration=22.903006186 podStartE2EDuration="1m2.228509265s" podCreationTimestamp="2026-03-19 00:20:53 +0000 UTC" firstStartedPulling="2026-03-19 00:21:09.694918149 +0000 UTC m=+834.233113280" lastFinishedPulling="2026-03-19 00:21:49.020421228 +0000 UTC m=+873.558616359" observedRunningTime="2026-03-19 00:21:50.637618259 +0000 UTC m=+875.175813390" watchObservedRunningTime="2026-03-19 00:21:55.228509265 +0000 UTC m=+879.766704396" Mar 19 00:21:55 crc kubenswrapper[4745]: I0319 00:21:55.239237 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:55 crc kubenswrapper[4745]: I0319 00:21:55.785753 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.109086 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.207550 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.619801 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vr5md"] Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.621027 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.625336 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.625634 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2x97z" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.626088 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.644494 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vr5md"] Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.761153 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtdsz\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-kube-api-access-mtdsz\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.761554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.863737 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtdsz\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-kube-api-access-mtdsz\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.863797 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.905948 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.911972 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtdsz\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-kube-api-access-mtdsz\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.962242 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.244386 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-cpzpl"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.245262 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.247327 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nv497" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.265767 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-cpzpl"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.375212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.375360 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxxn\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-kube-api-access-xvxxn\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.477747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxxn\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-kube-api-access-xvxxn\") pod 
\"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.477838 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.497293 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.509685 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxxn\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-kube-api-access-xvxxn\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.566333 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.582233 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" containerID="cri-o://857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" gracePeriod=2 Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.653432 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.655338 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.660173 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.660407 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.660596 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.689080 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.782909 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783093 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783124 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783287 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783412 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783654 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783699 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783736 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783849 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783926 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.885996 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886083 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886155 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886194 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886240 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886299 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886352 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886410 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886492 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886531 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886800 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886926 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887214 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: 
\"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887719 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.888140 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.888458 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.892466 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.893275 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.928481 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.933499 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:58 crc kubenswrapper[4745]: I0319 00:21:58.047501 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:58 crc kubenswrapper[4745]: I0319 00:21:58.737231 4745 generic.go:334] "Generic (PLEG): container finished" podID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" exitCode=0 Mar 19 00:21:58 crc kubenswrapper[4745]: I0319 00:21:58.737729 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1"} Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.155657 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.156587 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.156702 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.160787 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.161111 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.161801 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.322994 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"auto-csr-approver-29564662-znhfd\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.425336 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"auto-csr-approver-29564662-znhfd\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.452193 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"auto-csr-approver-29564662-znhfd\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.479459 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.262093 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.263931 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.264306 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.264413 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.761979 4745 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["cert-manager/cert-manager-545d4d4674-ptrd5"] Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.763068 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.766203 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r8wwt" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.793313 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ptrd5"] Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.798389 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-bound-sa-token\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.798489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqdw\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-kube-api-access-8zqdw\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.899102 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqdw\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-kube-api-access-8zqdw\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.899213 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-bound-sa-token\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.922260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-bound-sa-token\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:07 crc kubenswrapper[4745]: I0319 00:22:07.659650 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqdw\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-kube-api-access-8zqdw\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:07 crc kubenswrapper[4745]: I0319 00:22:07.688015 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.263498 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.266707 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.267335 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.267385 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.980752 4745 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.981051 4745 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 00:22:15 crc kubenswrapper[4745]: init container &Container{Name:manage-dockerfile,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe,Command:[],Args:[openshift-manage-dockerfile --v=0],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:BUILD,Value:{"kind":"Build","apiVersion":"build.openshift.io/v1","metadata":{"name":"service-telemetry-operator-1","namespace":"service-telemetry","uid":"6c226391-0d08-43e7-b93d-149a01173291","resourceVersion":"34861","generation":1,"creationTimestamp":"2026-03-19T00:21:44Z","labels":{"build":"service-telemetry-operator","buildconfig":"service-telemetry-operator","openshift.io/build-config.name":"service-telemetry-operator","openshift.io/build.start-policy":"Serial"},"annotations":{"openshift.io/build-config.name":"service-telemetry-operator","openshift.io/build.number":"1"},"ownerReferences":[{"apiVersion":"build.openshift.io/v1","kind":"BuildConfig","name":"service-telemetry-operator","uid":"aaf63d90-27c9-4514-a696-f2b813b6c2e3","controller":true}],"managedFields":[{"manager":"openshift-apiserver","operation":"Update","apiVersion":"build.openshift.io/v1","time":"2026-03-19T00:21:44Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:openshift.io/build-config.name":{},"f:openshift.io/build.number":{}},"f:labels":{".":{},"f:build":{},"f:buildconfig":{},"f:openshift.io/build-config.name":{},"f:openshift.io/build.start-policy":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"aaf63d90-27c9-4514-a696-f2b813b6c2e3\"}":{}}},"f:spec":{"f:output":{"f:to":{}},"f:serviceAccount":{},"f:source":{"f:dockerfile":{},"f:t
ype":{}},"f:strategy":{"f:dockerStrategy":{".":{},"f:from":{}},"f:type":{}},"f:triggeredBy":{}},"f:status":{"f:conditions":{".":{},"k:{\"type\":\"New\"}":{".":{},"f:lastTransitionTime":{},"f:lastUpdateTime":{},"f:status":{},"f:type":{}}},"f:config":{},"f:phase":{}}}}]},"spec":{"serviceAccount":"builder","source":{"type":"Dockerfile","dockerfile":"FROM quay.io/operator-framework/ansible-operator:v1.38.1\n\n# temporarily switch to root user to adjust image layers\nUSER 0\n# Upstream CI builds need the additional EPEL sources for python3-passlib and python3-bcrypt but have no working repos to install epel-release\n# NO_PROXY is undefined in upstream CI builds, but defined (usually blank) during openshift builds (a possibly brittle hack)\nRUN bash -c -- 'if [ \"${NO_PROXY:-__ZZZZZ}\" == \"__ZZZZZ\" ]; then echo \"Applying upstream EPEL hacks\" \u0026\u0026 echo -e \"-----BEGIN PGP PUBLIC KEY BLOCK-----\\nmQINBGE3mOsBEACsU+XwJWDJVkItBaugXhXIIkb9oe+7aadELuVo0kBmc3HXt/Yp\\nCJW9hHEiGZ6z2jwgPqyJjZhCvcAWvgzKcvqE+9i0NItV1rzfxrBe2BtUtZmVcuE6\\n2b+SPfxQ2Hr8llaawRjt8BCFX/ZzM4/1Qk+EzlfTcEcpkMf6wdO7kD6ulBk/tbsW\\nDHX2lNcxszTf+XP9HXHWJlA2xBfP+Dk4gl4DnO2Y1xR0OSywE/QtvEbN5cY94ieu\\nn7CBy29AleMhmbnx9pw3NyxcFIAsEZHJoU4ZW9ulAJ/ogttSyAWeacW7eJGW31/Z\\n39cS+I4KXJgeGRI20RmpqfH0tuT+X5Da59YpjYxkbhSK3HYBVnNPhoJFUc2j5iKy\\nXLgkapu1xRnEJhw05kr4LCbud0NTvfecqSqa+59kuVc+zWmfTnGTYc0PXZ6Oa3rK\\n44UOmE6eAT5zd/ToleDO0VesN+EO7CXfRsm7HWGpABF5wNK3vIEF2uRr2VJMvgqS\\n9eNwhJyOzoca4xFSwCkc6dACGGkV+CqhufdFBhmcAsUotSxe3zmrBjqA0B/nxIvH\\nDVgOAMnVCe+Lmv8T0mFgqZSJdIUdKjnOLu/GRFhjDKIak4jeMBMTYpVnU+HhMHLq\\nuDiZkNEvEEGhBQmZuI8J55F/a6UURnxUwT3piyi3Pmr2IFD7ahBxPzOBCQARAQAB\\ntCdGZWRvcmEgKGVwZWw5KSA8ZXBlbEBmZWRvcmFwcm9qZWN0Lm9yZz6JAk4EEwEI\\nADgWIQT/itE0RZcQbs6BO5GKOHK/MihGfAUCYTeY6wIbDwULCQgHAgYVCgkICwIE\\nFgIDAQIeAQIXgAAKCRCKOHK/MihGfFX/EACBPWv20+ttYu1A5WvtHJPzwbj0U4yF\\n3zTQpBglQ2UfkRpYdipTlT3Ih6j5h2VmgRPtINCc/ZE28adrWpBoeFIS2YAKOCLC\\nnZYtHl2nCoLq1U7FSttUGsZ/t8uGCBgnugTfnIYcmlP1jKKA6RJAclK89evDQX5n\\nR9ZD+Cq3CBMltt
vSTCht0qQVlwycedH8iWyYgP/mF0W35BIn7NuuZwWhgR00n/VG\\n4nbKPOzTWbsP45awcmivdrS74P6mL84WfkghipdmcoyVb1B8ZP4Y/Ke0RXOnLhNe\\nCfrXXvuW+Pvg2RTfwRDtehGQPAgXbmLmz2ZkV69RGIr54HJv84NDbqZovRTMr7gL\\n9k3ciCzXCiYQgM8yAyGHV0KEhFSQ1HV7gMnt9UmxbxBE2pGU7vu3CwjYga5DpwU7\\nw5wu1TmM5KgZtZvuWOTDnqDLf0cKoIbW8FeeCOn24elcj32bnQDuF9DPey1mqcvT\\n/yEo/Ushyz6CVYxN8DGgcy2M9JOsnmjDx02h6qgWGWDuKgb9jZrvRedpAQCeemEd\\nfhEs6ihqVxRFl16HxC4EVijybhAL76SsM2nbtIqW1apBQJQpXWtQwwdvgTVpdEtE\\nr4ArVJYX5LrswnWEQMOelugUG6S3ZjMfcyOa/O0364iY73vyVgaYK+2XtT2usMux\\nVL469Kj5m13T6w==\\n=Mjs/\\n-----END PGP PUBLIC KEY BLOCK-----\" \u003e /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-9 \u0026\u0026 echo -e \"[epel]\\nname=Extra Packages for Enterprise Linux 9 - \\$basearch\\nmetalink=https://mirrors.fedoraproject.org/metalink?repo=epel-9\u0026arch=\\$basearch\u0026infra=\\$infra\u0026content=\\$contentdir\\nenabled=1\\ngpgcheck=1\\ngpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-9\" \u003e /etc/yum.repos.d/epel.repo; fi'\n\n# update the base image to allow forward-looking optimistic updates during the testing phase, with the added benefit of helping move closer to passing security scans.\n# -- excludes ansible so it remains at 2.9 tag as shipped with the base image\n# -- installs python3-passlib and python3-bcrypt for oauth-proxy interface\n# -- cleans up the cached data from dnf to keep the image as small as possible\nRUN dnf update -y --exclude=ansible* \u0026\u0026 dnf install -y python3-passlib python3-bcrypt \u0026\u0026 dnf clean all \u0026\u0026 rm -rf /var/cache/dnf\n\nCOPY requirements.yml ${HOME}/requirements.yml\nRUN ansible-galaxy collection install -r ${HOME}/requirements.yml \\\n \u0026\u0026 chmod -R ug+rwx ${HOME}/.ansible\n\n# switch back to user 1001 when running the base image (non-root)\nUSER 1001\n\n# copy in required artifacts for the operator\nCOPY watches.yaml ${HOME}/watches.yaml\nCOPY roles/ 
${HOME}/roles/\n"},"strategy":{"type":"Docker","dockerStrategy":{"from":{"kind":"DockerImage","name":"quay.io/operator-framework/ansible-operator@sha256:9895727b7f66bb88fa4c6afdefc7eecf86e6b7c1293920f866a035da9decc58e"},"pullSecret":{"name":"builder-dockercfg-vcnqb"}}},"output":{"to":{"kind":"DockerImage","name":"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-operator:latest"},"pushSecret":{"name":"builder-dockercfg-vcnqb"}},"resources":{},"postCommit":{},"nodeSelector":null,"triggeredBy":[{"message":"Image change","imageChangeBuild":{"imageID":"quay.io/operator-framework/ansible-operator@sha256:9895727b7f66bb88fa4c6afdefc7eecf86e6b7c1293920f866a035da9decc58e","fromRef":{"kind":"ImageStreamTag","name":"ansible-operator:v1.38.1"}}}]},"status":{"phase":"New","outputDockerImageReference":"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-operator:latest","config":{"kind":"BuildConfig","namespace":"service-telemetry","name":"service-telemetry-operator"},"output":{},"conditions":[{"type":"New","status":"True","lastUpdateTime":"2026-03-19T00:21:44Z","lastTransitionTime":"2026-03-19T00:21:44Z"}]}} Mar 19 00:22:15 crc kubenswrapper[4745]: 
,ValueFrom:nil,},EnvVar{Name:LANG,Value:C.utf8,ValueFrom:nil,},EnvVar{Name:BUILD_REGISTRIES_CONF_PATH,Value:/var/run/configs/openshift.io/build-system/registries.conf,ValueFrom:nil,},EnvVar{Name:BUILD_REGISTRIES_DIR_PATH,Value:/var/run/configs/openshift.io/build-system/registries.d,ValueFrom:nil,},EnvVar{Name:BUILD_SIGNATURE_POLICY_PATH,Value:/var/run/configs/openshift.io/build-system/policy.json,ValueFrom:nil,},EnvVar{Name:BUILD_STORAGE_CONF_PATH,Value:/var/run/configs/openshift.io/build-system/storage.conf,ValueFrom:nil,},EnvVar{Name:BUILD_BLOBCACHE_DIR,Value:/var/cache/blobs,ValueFrom:nil,},EnvVar{Name:HTTP_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:http_proxy,Value:,ValueFrom:nil,},EnvVar{Name:HTTPS_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:https_proxy,Value:,ValueFrom:nil,},EnvVar{Name:NO_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:no_proxy,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:buildworkdir,ReadOnly:false,MountPath:/tmp/build,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-system-configs,ReadOnly:true,MountPath:/var/run/configs/openshift.io/build-system,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-ca-bundles,ReadOnly:false,MountPath:/var/run/configs/openshift.io/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-proxy-ca-bundles,ReadOnly:false,MountPath:/var/run/configs/openshift.io/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-blob-cache,ReadOnly:false,MountPath:/var/cache/blobs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwpcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,L
ifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[CHOWN DAC_OVERRIDE],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-telemetry-operator-1-build_service-telemetry(6edd3146-9ede-4a63-b72b-5987ed600bce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 19 00:22:15 crc kubenswrapper[4745]: > logger="UnhandledError" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.982415 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manage-dockerfile\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/service-telemetry-operator-1-build" podUID="6edd3146-9ede-4a63-b72b-5987ed600bce" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.001983 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.184741 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.185268 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.185377 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.200409 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities" (OuterVolumeSpecName: "utilities") pod "2886a73a-35b3-4014-ab67-1b88fa88b4d8" (UID: "2886a73a-35b3-4014-ab67-1b88fa88b4d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.293487 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm" (OuterVolumeSpecName: "kube-api-access-8hfvm") pod "2886a73a-35b3-4014-ab67-1b88fa88b4d8" (UID: "2886a73a-35b3-4014-ab67-1b88fa88b4d8"). InnerVolumeSpecName "kube-api-access-8hfvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.294443 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.294466 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.323801 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2886a73a-35b3-4014-ab67-1b88fa88b4d8" (UID: "2886a73a-35b3-4014-ab67-1b88fa88b4d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.398003 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.829892 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.829841 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"160fc9f43fde151a57b67e2c8925b999f7f111696e51e7dd87c757005709ba3f"} Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.830994 4745 scope.go:117] "RemoveContainer" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.052106 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.060519 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.185109 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2886a73a_35b3_4014_ab67_1b88fa88b4d8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2886a73a_35b3_4014_ab67_1b88fa88b4d8.slice/crio-160fc9f43fde151a57b67e2c8925b999f7f111696e51e7dd87c757005709ba3f\": RecentStats: unable to find data in memory cache]" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.214648 4745 scope.go:117] "RemoveContainer" containerID="24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.254317 4745 scope.go:117] "RemoveContainer" containerID="4d2e067269ab8d6ad4ff9fd9c0c25e46f9248271c2c159ef4b71cf8df753032d" Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.266656 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.266924 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(884040c3-6c56-45b0-881d-e73f52c0ab34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.268300 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:17 crc 
kubenswrapper[4745]: I0319 00:22:17.403339 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-cpzpl"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.449941 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.613981 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.619007 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeed3a0c_ada3_41b5_895b_8acc45926539.slice/crio-384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5 WatchSource:0}: Error finding container 384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5: Status 404 returned error can't find the container with id 384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5 Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621453 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621500 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621553 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621637 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621657 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621729 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: 
\"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621739 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621774 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 
00:22:17.622054 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622075 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622179 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622192 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622205 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622228 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: 
"6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622340 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622989 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.623673 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.625297 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.629043 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.629093 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.629119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd" (OuterVolumeSpecName: "kube-api-access-bwpcd") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "kube-api-access-bwpcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.688193 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ptrd5"] Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.690777 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93f48ad8_0863_4d90_abac_b887096b386c.slice/crio-b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970 WatchSource:0}: Error finding container b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970: Status 404 returned error can't find the container with id b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970 Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.691581 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f7ceaac_a9f7_467b_83c9_298813ff6323.slice/crio-7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a WatchSource:0}: Error finding container 7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a: Status 404 returned error can't find the container with id 7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.698603 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vr5md"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.721382 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724157 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 
00:22:17.724187 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724196 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724206 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724216 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724224 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724261 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724270 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724281 4745 
reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.725608 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9628a478_fb27_4c42_bcf5_2a329898708b.slice/crio-58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978 WatchSource:0}: Error finding container 58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978: Status 404 returned error can't find the container with id 58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978 Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.836819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerStarted","Data":"58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.839195 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" event={"ID":"9f7ceaac-a9f7-467b-83c9-298813ff6323","Type":"ContainerStarted","Data":"7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.843048 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" event={"ID":"bbe8b718-863a-404e-9be9-e872318f1ac0","Type":"ContainerStarted","Data":"a44f785d85411a8d3d7d159203cf5f427adb861697c350dd1de19f7a9fde89f7"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.845743 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.845742 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6edd3146-9ede-4a63-b72b-5987ed600bce","Type":"ContainerDied","Data":"694ce7af8b3166a88da3080fef22e0db15c17a9128111ade8560fc1f11e9a583"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.847549 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564662-znhfd" event={"ID":"deed3a0c-ada3-41b5-895b-8acc45926539","Type":"ContainerStarted","Data":"384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.848838 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ptrd5" event={"ID":"93f48ad8-0863-4d90-abac-b887096b386c","Type":"ContainerStarted","Data":"b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970"} Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.850498 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.931264 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.937643 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.071732 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 
00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.103110 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.147937 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" path="/var/lib/kubelet/pods/2886a73a-35b3-4014-ab67-1b88fa88b4d8/volumes" Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.149045 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edd3146-9ede-4a63-b72b-5987ed600bce" path="/var/lib/kubelet/pods/6edd3146-9ede-4a63-b72b-5987ed600bce/volumes" Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.861646 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerStarted","Data":"dd061ee450110ddfede6713e7818b0934e91c5d3775e82e313f1bc2da43e1a88"} Mar 19 00:22:18 crc kubenswrapper[4745]: E0319 00:22:18.864348 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:19 crc kubenswrapper[4745]: I0319 00:22:19.871807 4745 generic.go:334] "Generic (PLEG): container finished" podID="deed3a0c-ada3-41b5-895b-8acc45926539" containerID="7d29ab0663977a94ba5c0f15b3cbd0ce7ec172f2fc28bc0ca2d89b44013b1e84" exitCode=0 Mar 19 00:22:19 crc kubenswrapper[4745]: I0319 00:22:19.871842 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564662-znhfd" event={"ID":"deed3a0c-ada3-41b5-895b-8acc45926539","Type":"ContainerDied","Data":"7d29ab0663977a94ba5c0f15b3cbd0ce7ec172f2fc28bc0ca2d89b44013b1e84"} Mar 19 00:22:19 crc 
kubenswrapper[4745]: E0319 00:22:19.874852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.249216 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.306284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"deed3a0c-ada3-41b5-895b-8acc45926539\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.313392 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv" (OuterVolumeSpecName: "kube-api-access-gwsgv") pod "deed3a0c-ada3-41b5-895b-8acc45926539" (UID: "deed3a0c-ada3-41b5-895b-8acc45926539"). InnerVolumeSpecName "kube-api-access-gwsgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.407928 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.902504 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" event={"ID":"bbe8b718-863a-404e-9be9-e872318f1ac0","Type":"ContainerStarted","Data":"4a60e5014334d2bfaa36f00bc65a9db1478d4b51d8a1b3468b45b453adce650a"} Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.904279 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.904267 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564662-znhfd" event={"ID":"deed3a0c-ada3-41b5-895b-8acc45926539","Type":"ContainerDied","Data":"384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5"} Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.904395 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.906007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ptrd5" event={"ID":"93f48ad8-0863-4d90-abac-b887096b386c","Type":"ContainerStarted","Data":"0ce1cf8b1ab7f41eba4be95314b4d5e70806db344dc2f5207a232338609b2723"} Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.907580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" 
event={"ID":"9f7ceaac-a9f7-467b-83c9-298813ff6323","Type":"ContainerStarted","Data":"ed08192019c25324529343b9fe7d94e2e2866ff48cd56fb32836733501c7f185"} Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.907690 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.925149 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" podStartSLOduration=21.035819479 podStartE2EDuration="25.925122534s" podCreationTimestamp="2026-03-19 00:21:57 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.425036703 +0000 UTC m=+901.963231834" lastFinishedPulling="2026-03-19 00:22:22.314339758 +0000 UTC m=+906.852534889" observedRunningTime="2026-03-19 00:22:22.921533251 +0000 UTC m=+907.459728392" watchObservedRunningTime="2026-03-19 00:22:22.925122534 +0000 UTC m=+907.463317665" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.952460 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-ptrd5" podStartSLOduration=12.302124814999999 podStartE2EDuration="16.952430695s" podCreationTimestamp="2026-03-19 00:22:06 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.69325549 +0000 UTC m=+902.231450621" lastFinishedPulling="2026-03-19 00:22:22.34356137 +0000 UTC m=+906.881756501" observedRunningTime="2026-03-19 00:22:22.95224128 +0000 UTC m=+907.490436411" watchObservedRunningTime="2026-03-19 00:22:22.952430695 +0000 UTC m=+907.490625826" Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.985698 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" podStartSLOduration=22.358471483 podStartE2EDuration="26.985671442s" podCreationTimestamp="2026-03-19 00:21:56 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.693538759 +0000 UTC 
m=+902.231733890" lastFinishedPulling="2026-03-19 00:22:22.320738718 +0000 UTC m=+906.858933849" observedRunningTime="2026-03-19 00:22:22.981292206 +0000 UTC m=+907.519487337" watchObservedRunningTime="2026-03-19 00:22:22.985671442 +0000 UTC m=+907.523866573" Mar 19 00:22:23 crc kubenswrapper[4745]: I0319 00:22:23.316720 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"] Mar 19 00:22:23 crc kubenswrapper[4745]: I0319 00:22:23.347817 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"] Mar 19 00:22:24 crc kubenswrapper[4745]: I0319 00:22:24.146783 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" path="/var/lib/kubelet/pods/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1/volumes" Mar 19 00:22:28 crc kubenswrapper[4745]: I0319 00:22:28.953684 4745 generic.go:334] "Generic (PLEG): container finished" podID="9628a478-fb27-4c42-bcf5-2a329898708b" containerID="dd061ee450110ddfede6713e7818b0934e91c5d3775e82e313f1bc2da43e1a88" exitCode=0 Mar 19 00:22:28 crc kubenswrapper[4745]: I0319 00:22:28.953817 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"dd061ee450110ddfede6713e7818b0934e91c5d3775e82e313f1bc2da43e1a88"} Mar 19 00:22:29 crc kubenswrapper[4745]: I0319 00:22:29.962293 4745 generic.go:334] "Generic (PLEG): container finished" podID="9628a478-fb27-4c42-bcf5-2a329898708b" containerID="c30d083389fd186a7069c7aaf9d05af719cf1d4cce9883d54a6420598b98e1e5" exitCode=0 Mar 19 00:22:29 crc kubenswrapper[4745]: I0319 00:22:29.962390 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" 
event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"c30d083389fd186a7069c7aaf9d05af719cf1d4cce9883d54a6420598b98e1e5"} Mar 19 00:22:30 crc kubenswrapper[4745]: I0319 00:22:30.023488 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_9628a478-fb27-4c42-bcf5-2a329898708b/manage-dockerfile/0.log" Mar 19 00:22:30 crc kubenswrapper[4745]: I0319 00:22:30.972462 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerStarted","Data":"b4703b83174ab7b2103b6b149b2ad8cb67a489bfe97d982169000bed076edd3c"} Mar 19 00:22:31 crc kubenswrapper[4745]: I0319 00:22:31.006727 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=33.482700093 podStartE2EDuration="34.006704941s" podCreationTimestamp="2026-03-19 00:21:57 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.729008046 +0000 UTC m=+902.267203177" lastFinishedPulling="2026-03-19 00:22:18.253012894 +0000 UTC m=+902.791208025" observedRunningTime="2026-03-19 00:22:31.004486352 +0000 UTC m=+915.542681483" watchObservedRunningTime="2026-03-19 00:22:31.006704941 +0000 UTC m=+915.544900072" Mar 19 00:22:31 crc kubenswrapper[4745]: I0319 00:22:31.964985 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:22:34 crc kubenswrapper[4745]: I0319 00:22:34.998239 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerStarted","Data":"31a8a2e6d26279247a06c5389e7cc4810d760e5b6b34dc49c80127d9e15a8c71"} Mar 19 00:22:35 crc kubenswrapper[4745]: I0319 00:22:35.001312 4745 scope.go:117] "RemoveContainer" 
containerID="4cf1138f66461b0db8f8d82a562c5595a0d75aaab97369e7761de279cdf0fb9b" Mar 19 00:22:36 crc kubenswrapper[4745]: I0319 00:22:36.008666 4745 generic.go:334] "Generic (PLEG): container finished" podID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerID="31a8a2e6d26279247a06c5389e7cc4810d760e5b6b34dc49c80127d9e15a8c71" exitCode=0 Mar 19 00:22:36 crc kubenswrapper[4745]: I0319 00:22:36.008757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerDied","Data":"31a8a2e6d26279247a06c5389e7cc4810d760e5b6b34dc49c80127d9e15a8c71"} Mar 19 00:22:37 crc kubenswrapper[4745]: I0319 00:22:37.190080 4745 generic.go:334] "Generic (PLEG): container finished" podID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerID="759c92c7a60155345946936f340a97217251cee94b6ed490cd8f7fb3932d4668" exitCode=0 Mar 19 00:22:37 crc kubenswrapper[4745]: I0319 00:22:37.190158 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerDied","Data":"759c92c7a60155345946936f340a97217251cee94b6ed490cd8f7fb3932d4668"} Mar 19 00:22:38 crc kubenswrapper[4745]: I0319 00:22:38.198638 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerStarted","Data":"f7ef869940915ea60ea453e1e20afbd2d82e3f9a1d01d7292debb45a26fd780c"} Mar 19 00:22:38 crc kubenswrapper[4745]: I0319 00:22:38.198959 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:22:38 crc kubenswrapper[4745]: I0319 00:22:38.241697 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=19.546775413 podStartE2EDuration="1m20.241667527s" 
podCreationTimestamp="2026-03-19 00:21:18 +0000 UTC" firstStartedPulling="2026-03-19 00:21:33.227490461 +0000 UTC m=+857.765685602" lastFinishedPulling="2026-03-19 00:22:33.922382585 +0000 UTC m=+918.460577716" observedRunningTime="2026-03-19 00:22:38.232400758 +0000 UTC m=+922.770595919" watchObservedRunningTime="2026-03-19 00:22:38.241667527 +0000 UTC m=+922.779862678" Mar 19 00:22:49 crc kubenswrapper[4745]: I0319 00:22:49.205140 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerName="elasticsearch" probeResult="failure" output=< Mar 19 00:22:49 crc kubenswrapper[4745]: {"timestamp": "2026-03-19T00:22:49+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 19 00:22:49 crc kubenswrapper[4745]: > Mar 19 00:22:54 crc kubenswrapper[4745]: I0319 00:22:54.403841 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerName="elasticsearch" probeResult="failure" output=< Mar 19 00:22:54 crc kubenswrapper[4745]: {"timestamp": "2026-03-19T00:22:54+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 19 00:22:54 crc kubenswrapper[4745]: > Mar 19 00:22:59 crc kubenswrapper[4745]: I0319 00:22:59.186870 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerName="elasticsearch" probeResult="failure" output=< Mar 19 00:22:59 crc kubenswrapper[4745]: {"timestamp": "2026-03-19T00:22:59+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 19 00:22:59 crc kubenswrapper[4745]: > Mar 19 00:23:04 crc kubenswrapper[4745]: I0319 00:23:04.578692 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.266856 
4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b6sd9"] Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.267925 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.267943 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.267959 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-utilities" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.267968 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-utilities" Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.267994 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-content" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268002 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-content" Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.268014 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" containerName="oc" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268020 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" containerName="oc" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268154 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" containerName="oc" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268179 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.269200 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.285094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"] Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.390145 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.390224 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.390308 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.491093 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"community-operators-b6sd9\" 
(UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.491472 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.491582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.492019 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.492043 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.511375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " 
pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.588003 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.928716 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"] Mar 19 00:23:41 crc kubenswrapper[4745]: I0319 00:23:41.838652 4745 generic.go:334] "Generic (PLEG): container finished" podID="23856082-7489-4bd9-8561-9492d211f62f" containerID="c31f53226edb12a4c47ac12a63ba58b187d38f0846755a55d90f6873ac538535" exitCode=0 Mar 19 00:23:41 crc kubenswrapper[4745]: I0319 00:23:41.838711 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"c31f53226edb12a4c47ac12a63ba58b187d38f0846755a55d90f6873ac538535"} Mar 19 00:23:41 crc kubenswrapper[4745]: I0319 00:23:41.838750 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerStarted","Data":"e8fbd88069b4baabfb51a65627fb36b764b5adb30577d5155d7c9e993c63fb41"} Mar 19 00:23:42 crc kubenswrapper[4745]: I0319 00:23:42.846132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerStarted","Data":"ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b"} Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.044937 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8x87"] Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.047041 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.062537 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8x87"] Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.233559 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-utilities\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.233769 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-catalog-content\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.233854 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wh5\" (UniqueName: \"kubernetes.io/projected/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-kube-api-access-r7wh5\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wh5\" (UniqueName: \"kubernetes.io/projected/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-kube-api-access-r7wh5\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335268 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-utilities\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335355 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-catalog-content\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335752 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-utilities\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.336009 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-catalog-content\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.360279 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wh5\" (UniqueName: \"kubernetes.io/projected/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-kube-api-access-r7wh5\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.366088 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.857414 4745 generic.go:334] "Generic (PLEG): container finished" podID="23856082-7489-4bd9-8561-9492d211f62f" containerID="ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b" exitCode=0 Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.857897 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b"} Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.894552 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8x87"] Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.887289 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerStarted","Data":"b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064"} Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.890070 4745 generic.go:334] "Generic (PLEG): container finished" podID="7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b" containerID="7bd9ccf914f960b4e9d8c85316c1ce12252a1497a9241b7b47ad5214ad9c9cc1" exitCode=0 Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.890164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerDied","Data":"7bd9ccf914f960b4e9d8c85316c1ce12252a1497a9241b7b47ad5214ad9c9cc1"} Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.891068 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" 
event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerStarted","Data":"f0786173a026e18ad20f4f30038dc776483d295db9841f6a9b8b8bd9268114cd"} Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.912658 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b6sd9" podStartSLOduration=2.474492893 podStartE2EDuration="4.912636918s" podCreationTimestamp="2026-03-19 00:23:40 +0000 UTC" firstStartedPulling="2026-03-19 00:23:41.840766245 +0000 UTC m=+986.378961376" lastFinishedPulling="2026-03-19 00:23:44.27891027 +0000 UTC m=+988.817105401" observedRunningTime="2026-03-19 00:23:44.909304204 +0000 UTC m=+989.447499345" watchObservedRunningTime="2026-03-19 00:23:44.912636918 +0000 UTC m=+989.450832059" Mar 19 00:23:45 crc kubenswrapper[4745]: I0319 00:23:45.606478 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:23:45 crc kubenswrapper[4745]: I0319 00:23:45.606571 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:23:50 crc kubenswrapper[4745]: I0319 00:23:50.589045 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:50 crc kubenswrapper[4745]: I0319 00:23:50.589813 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:50 crc kubenswrapper[4745]: I0319 00:23:50.781736 4745 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:51 crc kubenswrapper[4745]: I0319 00:23:51.028527 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:51 crc kubenswrapper[4745]: I0319 00:23:51.098757 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"] Mar 19 00:23:53 crc kubenswrapper[4745]: I0319 00:23:53.006151 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b6sd9" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server" containerID="cri-o://b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064" gracePeriod=2 Mar 19 00:23:54 crc kubenswrapper[4745]: I0319 00:23:54.017249 4745 generic.go:334] "Generic (PLEG): container finished" podID="23856082-7489-4bd9-8561-9492d211f62f" containerID="b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064" exitCode=0 Mar 19 00:23:54 crc kubenswrapper[4745]: I0319 00:23:54.017322 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064"} Mar 19 00:23:55 crc kubenswrapper[4745]: I0319 00:23:55.966116 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.093699 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"23856082-7489-4bd9-8561-9492d211f62f\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.093871 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"23856082-7489-4bd9-8561-9492d211f62f\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.093928 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"23856082-7489-4bd9-8561-9492d211f62f\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.098445 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities" (OuterVolumeSpecName: "utilities") pod "23856082-7489-4bd9-8561-9492d211f62f" (UID: "23856082-7489-4bd9-8561-9492d211f62f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.100113 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.100126 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"e8fbd88069b4baabfb51a65627fb36b764b5adb30577d5155d7c9e993c63fb41"} Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.100197 4745 scope.go:117] "RemoveContainer" containerID="b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.105013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerStarted","Data":"b1f131ab406d856dab787928a0854b0baa6f609e17dd782cdce42e98836eb0fb"} Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.107178 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c" (OuterVolumeSpecName: "kube-api-access-9jc9c") pod "23856082-7489-4bd9-8561-9492d211f62f" (UID: "23856082-7489-4bd9-8561-9492d211f62f"). InnerVolumeSpecName "kube-api-access-9jc9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.135260 4745 scope.go:117] "RemoveContainer" containerID="ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.156605 4745 scope.go:117] "RemoveContainer" containerID="c31f53226edb12a4c47ac12a63ba58b187d38f0846755a55d90f6873ac538535" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.170115 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23856082-7489-4bd9-8561-9492d211f62f" (UID: "23856082-7489-4bd9-8561-9492d211f62f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.195945 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.195993 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") on node \"crc\" DevicePath \"\"" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.196004 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.433518 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"] Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.447852 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-b6sd9"] Mar 19 00:23:57 crc kubenswrapper[4745]: I0319 00:23:57.115851 4745 generic.go:334] "Generic (PLEG): container finished" podID="7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b" containerID="b1f131ab406d856dab787928a0854b0baa6f609e17dd782cdce42e98836eb0fb" exitCode=0 Mar 19 00:23:57 crc kubenswrapper[4745]: I0319 00:23:57.115929 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerDied","Data":"b1f131ab406d856dab787928a0854b0baa6f609e17dd782cdce42e98836eb0fb"} Mar 19 00:23:58 crc kubenswrapper[4745]: I0319 00:23:58.147729 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23856082-7489-4bd9-8561-9492d211f62f" path="/var/lib/kubelet/pods/23856082-7489-4bd9-8561-9492d211f62f/volumes" Mar 19 00:23:59 crc kubenswrapper[4745]: I0319 00:23:59.135580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerStarted","Data":"f0ace3c9cf8d5ce03926099d2106cc64f7295db6ffdd337710c9734b56b66f65"} Mar 19 00:23:59 crc kubenswrapper[4745]: I0319 00:23:59.159556 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8x87" podStartSLOduration=2.8302934950000003 podStartE2EDuration="16.159529669s" podCreationTimestamp="2026-03-19 00:23:43 +0000 UTC" firstStartedPulling="2026-03-19 00:23:44.899662986 +0000 UTC m=+989.437858117" lastFinishedPulling="2026-03-19 00:23:58.22889916 +0000 UTC m=+1002.767094291" observedRunningTime="2026-03-19 00:23:59.156771583 +0000 UTC m=+1003.694966724" watchObservedRunningTime="2026-03-19 00:23:59.159529669 +0000 UTC m=+1003.697724850" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148297 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:24:00 crc kubenswrapper[4745]: E0319 00:24:00.148795 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-content" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148812 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-content" Mar 19 00:24:00 crc kubenswrapper[4745]: E0319 00:24:00.148832 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148839 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server" Mar 19 00:24:00 crc kubenswrapper[4745]: E0319 00:24:00.148852 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-utilities" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148858 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-utilities" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148986 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.149466 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.151969 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.152949 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.154287 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.168094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.253744 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"auto-csr-approver-29564664-hhtnq\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.355372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"auto-csr-approver-29564664-hhtnq\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.400549 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"auto-csr-approver-29564664-hhtnq\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " 
pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.467732 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.744278 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:24:01 crc kubenswrapper[4745]: I0319 00:24:01.149440 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" event={"ID":"d9a8819f-c57d-463c-9089-fbf3b29e12bc","Type":"ContainerStarted","Data":"742098faa80518f702910fff7c9cc0a70dc0fad59e218550a4356a422ed77c0a"} Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.165538 4745 generic.go:334] "Generic (PLEG): container finished" podID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerID="102c360c5a32588e5b65407ade841670a288ba3942421d8abc329207a20bc972" exitCode=0 Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.165642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" event={"ID":"d9a8819f-c57d-463c-9089-fbf3b29e12bc","Type":"ContainerDied","Data":"102c360c5a32588e5b65407ade841670a288ba3942421d8abc329207a20bc972"} Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.366928 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.366991 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.412820 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.214568 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.328366 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8x87"] Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.371951 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.372297 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vj7rp" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server" containerID="cri-o://d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" gracePeriod=2 Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.773568 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.930733 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"fa950165-f194-4022-8333-581d7681fc74\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.930875 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"fa950165-f194-4022-8333-581d7681fc74\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.930949 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod 
\"fa950165-f194-4022-8333-581d7681fc74\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.933173 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities" (OuterVolumeSpecName: "utilities") pod "fa950165-f194-4022-8333-581d7681fc74" (UID: "fa950165-f194-4022-8333-581d7681fc74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.938314 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b" (OuterVolumeSpecName: "kube-api-access-q6z2b") pod "fa950165-f194-4022-8333-581d7681fc74" (UID: "fa950165-f194-4022-8333-581d7681fc74"). InnerVolumeSpecName "kube-api-access-q6z2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.986633 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa950165-f194-4022-8333-581d7681fc74" (UID: "fa950165-f194-4022-8333-581d7681fc74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.033068 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.033126 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.033143 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183069 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa950165-f194-4022-8333-581d7681fc74" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" exitCode=0 Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183141 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183157 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f"} Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183211 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"c186af9e1f5669e5491c37e499f3a6a8a28b64cddfa66b87effddfaec8dbd826"} Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183233 4745 scope.go:117] "RemoveContainer" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.200742 4745 scope.go:117] "RemoveContainer" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.227046 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.227181 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.247068 4745 scope.go:117] "RemoveContainer" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260002 4745 scope.go:117] "RemoveContainer" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" Mar 19 00:24:05 crc kubenswrapper[4745]: E0319 00:24:05.260434 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f\": container with ID starting with d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f not found: ID does not exist" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260494 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f"} err="failed to get container status \"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f\": rpc error: code = NotFound desc = could not find container \"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f\": container with ID starting with d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f not found: ID does not exist" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260527 4745 scope.go:117] "RemoveContainer" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" Mar 19 00:24:05 crc kubenswrapper[4745]: E0319 00:24:05.260948 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0\": container with ID starting with e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0 not found: ID does not exist" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260983 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0"} err="failed to get container status \"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0\": rpc error: code = NotFound desc = could not find container \"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0\": container with ID 
starting with e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0 not found: ID does not exist" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.261008 4745 scope.go:117] "RemoveContainer" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" Mar 19 00:24:05 crc kubenswrapper[4745]: E0319 00:24:05.261328 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc\": container with ID starting with 04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc not found: ID does not exist" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.261361 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc"} err="failed to get container status \"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc\": rpc error: code = NotFound desc = could not find container \"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc\": container with ID starting with 04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc not found: ID does not exist" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.414367 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.539001 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.554449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58" (OuterVolumeSpecName: "kube-api-access-4zj58") pod "d9a8819f-c57d-463c-9089-fbf3b29e12bc" (UID: "d9a8819f-c57d-463c-9089-fbf3b29e12bc"). InnerVolumeSpecName "kube-api-access-4zj58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.640807 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.146006 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa950165-f194-4022-8333-581d7681fc74" path="/var/lib/kubelet/pods/fa950165-f194-4022-8333-581d7681fc74/volumes" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.191963 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.191989 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" event={"ID":"d9a8819f-c57d-463c-9089-fbf3b29e12bc","Type":"ContainerDied","Data":"742098faa80518f702910fff7c9cc0a70dc0fad59e218550a4356a422ed77c0a"} Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.192499 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742098faa80518f702910fff7c9cc0a70dc0fad59e218550a4356a422ed77c0a" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.492522 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.496989 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:24:08 crc kubenswrapper[4745]: I0319 00:24:08.146176 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7807a7d0-ff52-4a76-b083-19eca144b510" path="/var/lib/kubelet/pods/7807a7d0-ff52-4a76-b083-19eca144b510/volumes" Mar 19 00:24:15 crc kubenswrapper[4745]: I0319 00:24:15.606780 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:24:15 crc kubenswrapper[4745]: I0319 00:24:15.607659 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:24:21 crc 
kubenswrapper[4745]: I0319 00:24:21.294297 4745 generic.go:334] "Generic (PLEG): container finished" podID="9628a478-fb27-4c42-bcf5-2a329898708b" containerID="b4703b83174ab7b2103b6b149b2ad8cb67a489bfe97d982169000bed076edd3c" exitCode=0 Mar 19 00:24:21 crc kubenswrapper[4745]: I0319 00:24:21.294384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"b4703b83174ab7b2103b6b149b2ad8cb67a489bfe97d982169000bed076edd3c"} Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.573748 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698305 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698366 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698420 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698447 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698487 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698508 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698537 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698562 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod 
\"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698682 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698718 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698815 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699060 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699166 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699184 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.700248 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.700628 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.700793 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.708068 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.708102 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.708121 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5" (OuterVolumeSpecName: "kube-api-access-zq4w5") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "kube-api-access-zq4w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.734637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800737 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800772 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800783 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800792 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800801 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800809 4745 reconciler_common.go:293] "Volume detached for 
volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800819 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800828 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.870977 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.901637 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:23 crc kubenswrapper[4745]: I0319 00:24:23.311461 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978"} Mar 19 00:24:23 crc kubenswrapper[4745]: I0319 00:24:23.311516 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978" Mar 19 00:24:23 crc kubenswrapper[4745]: I0319 00:24:23.311560 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:24:24 crc kubenswrapper[4745]: I0319 00:24:24.494469 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:24 crc kubenswrapper[4745]: I0319 00:24:24.528615 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.109502 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110479 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="docker-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110496 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="docker-build" Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110508 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110515 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server" Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110525 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="git-clone" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110535 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="git-clone" Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110549 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-utilities" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110555 4745 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-utilities" Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110575 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerName="oc" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110582 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerName="oc" Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110593 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-content" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110601 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-content" Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110611 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="manage-dockerfile" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110618 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="manage-dockerfile" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110747 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110773 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerName="oc" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110783 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="docker-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.111638 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115378 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115448 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115706 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115852 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.125599 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166721 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166766 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166841 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166932 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167021 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167229 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: 
\"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167395 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167536 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167584 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268540 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268592 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268625 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268649 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268681 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268716 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268740 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268773 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 
crc kubenswrapper[4745]: I0319 00:24:27.268796 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268817 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268841 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269092 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 
00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269748 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269927 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270144 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270234 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270373 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270840 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.275612 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.276687 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.284915 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"smart-gateway-operator-1-build\" (UID: 
\"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.435085 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.640534 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:28 crc kubenswrapper[4745]: I0319 00:24:28.345488 4745 generic.go:334] "Generic (PLEG): container finished" podID="0383e703-f206-4571-8ca3-be59433df02c" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" exitCode=0 Mar 19 00:24:28 crc kubenswrapper[4745]: I0319 00:24:28.345547 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerDied","Data":"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82"} Mar 19 00:24:28 crc kubenswrapper[4745]: I0319 00:24:28.345581 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerStarted","Data":"137b1e8f43fe79dfa16fb03e4c3e074784118642c8f0428bdc642efcff9cad0a"} Mar 19 00:24:29 crc kubenswrapper[4745]: I0319 00:24:29.356279 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerStarted","Data":"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75"} Mar 19 00:24:29 crc kubenswrapper[4745]: I0319 00:24:29.390578 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.390549957 podStartE2EDuration="2.390549957s" podCreationTimestamp="2026-03-19 00:24:27 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:24:29.381637711 +0000 UTC m=+1033.919832852" watchObservedRunningTime="2026-03-19 00:24:29.390549957 +0000 UTC m=+1033.928745088" Mar 19 00:24:35 crc kubenswrapper[4745]: I0319 00:24:35.100728 4745 scope.go:117] "RemoveContainer" containerID="91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c" Mar 19 00:24:37 crc kubenswrapper[4745]: I0319 00:24:37.681373 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:37 crc kubenswrapper[4745]: I0319 00:24:37.682932 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" containerID="cri-o://b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" gracePeriod=30 Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.308668 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.310341 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.313099 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.313170 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.315409 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.341014 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.447825 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.447955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.447997 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod 
\"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.448800 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.448905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.448951 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449059 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449119 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449166 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449189 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449214 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449239 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551285 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551361 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551403 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551486 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: 
\"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551532 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551539 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551562 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551674 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551706 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxg4\" (UniqueName: 
\"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551749 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551783 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551874 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552003 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552068 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552091 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552312 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552818 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552931 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.553023 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.553036 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.558937 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.558958 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.575846 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 
00:24:39.632417 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.865508 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 19 00:24:39 crc kubenswrapper[4745]: W0319 00:24:39.871738 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3be06c_b88a_4749_b788_876b92486d65.slice/crio-1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60 WatchSource:0}: Error finding container 1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60: Status 404 returned error can't find the container with id 1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60 Mar 19 00:24:40 crc kubenswrapper[4745]: I0319 00:24:40.445920 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerStarted","Data":"1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60"} Mar 19 00:24:41 crc kubenswrapper[4745]: I0319 00:24:41.453936 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerStarted","Data":"1f712e863eed19f54a1076d7614b884aef6cf3e4d828606ee2cf52d7ed11bf86"} Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.190438 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_0383e703-f206-4571-8ca3-be59433df02c/docker-build/0.log" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.191399 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.302390 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.302467 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303603 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303653 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303693 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303723 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303751 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303786 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303946 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod 
\"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303974 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.304153 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.304525 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.304971 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.305104 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.307681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.307719 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.308564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.313161 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.313189 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw" (OuterVolumeSpecName: "kube-api-access-mxdzw") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "kube-api-access-mxdzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.313643 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.404678 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405030 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405131 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405195 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405250 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405345 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405415 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405491 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405549 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405601 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.447860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.463283 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_0383e703-f206-4571-8ca3-be59433df02c/docker-build/0.log" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.463859 4745 generic.go:334] "Generic (PLEG): container finished" podID="0383e703-f206-4571-8ca3-be59433df02c" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" exitCode=1 Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.463983 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.464028 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerDied","Data":"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75"} Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.464064 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerDied","Data":"137b1e8f43fe79dfa16fb03e4c3e074784118642c8f0428bdc642efcff9cad0a"} Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.464084 4745 scope.go:117] "RemoveContainer" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.507397 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.531912 4745 scope.go:117] "RemoveContainer" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.569865 4745 scope.go:117] "RemoveContainer" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" Mar 19 00:24:42 crc kubenswrapper[4745]: E0319 00:24:42.570870 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75\": container with ID starting with b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75 not found: ID does not exist" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" 
Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.570925 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75"} err="failed to get container status \"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75\": rpc error: code = NotFound desc = could not find container \"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75\": container with ID starting with b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75 not found: ID does not exist" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.570962 4745 scope.go:117] "RemoveContainer" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" Mar 19 00:24:42 crc kubenswrapper[4745]: E0319 00:24:42.571245 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82\": container with ID starting with 597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82 not found: ID does not exist" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.571838 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82"} err="failed to get container status \"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82\": rpc error: code = NotFound desc = could not find container \"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82\": container with ID starting with 597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82 not found: ID does not exist" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.698315 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.710648 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.797281 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.803129 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:43 crc kubenswrapper[4745]: I0319 00:24:43.473549 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c3be06c-b88a-4749-b788-876b92486d65" containerID="1f712e863eed19f54a1076d7614b884aef6cf3e4d828606ee2cf52d7ed11bf86" exitCode=0 Mar 19 00:24:43 crc kubenswrapper[4745]: I0319 00:24:43.473605 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"1f712e863eed19f54a1076d7614b884aef6cf3e4d828606ee2cf52d7ed11bf86"} Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.146824 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0383e703-f206-4571-8ca3-be59433df02c" path="/var/lib/kubelet/pods/0383e703-f206-4571-8ca3-be59433df02c/volumes" Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.483426 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c3be06c-b88a-4749-b788-876b92486d65" 
containerID="8387abf70bd6b56edb79f7d09a543aa55e696cf32abc591019ab21a211e52480" exitCode=0 Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.483835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"8387abf70bd6b56edb79f7d09a543aa55e696cf32abc591019ab21a211e52480"} Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.520563 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_7c3be06c-b88a-4749-b788-876b92486d65/manage-dockerfile/0.log" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.510002 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerStarted","Data":"2ab2822f8ced8f2081870ab6c5f34700f462d5ced58b7c2cbf1f5d29b9599f13"} Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.556620 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=6.556565649 podStartE2EDuration="6.556565649s" podCreationTimestamp="2026-03-19 00:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:24:45.548342895 +0000 UTC m=+1050.086538036" watchObservedRunningTime="2026-03-19 00:24:45.556565649 +0000 UTC m=+1050.094760780" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.606038 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.606136 4745 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.606207 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.607384 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.607464 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06" gracePeriod=600 Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520192 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06" exitCode=0 Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520265 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06"} Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520724 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89"} Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520780 4745 scope.go:117] "RemoveContainer" containerID="c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018" Mar 19 00:25:50 crc kubenswrapper[4745]: I0319 00:25:50.990546 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c3be06c-b88a-4749-b788-876b92486d65" containerID="2ab2822f8ced8f2081870ab6c5f34700f462d5ced58b7c2cbf1f5d29b9599f13" exitCode=0 Mar 19 00:25:50 crc kubenswrapper[4745]: I0319 00:25:50.990718 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"2ab2822f8ced8f2081870ab6c5f34700f462d5ced58b7c2cbf1f5d29b9599f13"} Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.250168 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358200 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358239 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358295 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358347 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359422 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358395 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359593 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359557 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359678 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359716 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360958 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360997 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359976 4745 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360032 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360711 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.361023 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362724 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362924 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362942 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362955 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362966 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc 
kubenswrapper[4745]: I0319 00:25:52.362976 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362987 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.364123 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.367105 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.367230 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4" (OuterVolumeSpecName: "kube-api-access-mhxg4") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "kube-api-access-mhxg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.368284 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465075 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465109 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465119 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465130 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.537107 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: 
"7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.565536 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:53 crc kubenswrapper[4745]: I0319 00:25:53.005446 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60"} Mar 19 00:25:53 crc kubenswrapper[4745]: I0319 00:25:53.005503 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60" Mar 19 00:25:53 crc kubenswrapper[4745]: I0319 00:25:53.005528 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:25:54 crc kubenswrapper[4745]: I0319 00:25:54.126851 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:54 crc kubenswrapper[4745]: I0319 00:25:54.190280 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.960418 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961131 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="git-clone" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961145 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="git-clone" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961162 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961168 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961178 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961185 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961200 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961206 4745 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961217 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961223 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961323 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961339 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.962041 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.964238 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.964556 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.964714 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.965084 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.986152 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071407 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071520 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071539 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071569 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 
00:25:57.071602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071848 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072035 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072069 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072101 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc 
kubenswrapper[4745]: I0319 00:25:57.072200 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072283 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174108 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174179 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174208 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174245 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174459 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174506 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174530 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174558 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174603 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174631 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174659 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175016 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175193 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod 
\"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175476 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175494 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175533 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.176307 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc 
kubenswrapper[4745]: I0319 00:25:57.176416 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.176081 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.182732 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.182799 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.192653 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.280934 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.524245 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:25:58 crc kubenswrapper[4745]: I0319 00:25:58.037915 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerStarted","Data":"7e76a7bf4434b7b734c027315e7c4db3ec75ff14cdec4eb02925ef312367c87c"} Mar 19 00:25:59 crc kubenswrapper[4745]: I0319 00:25:59.046190 4745 generic.go:334] "Generic (PLEG): container finished" podID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" exitCode=0 Mar 19 00:25:59 crc kubenswrapper[4745]: I0319 00:25:59.046307 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerDied","Data":"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226"} Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.062661 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerStarted","Data":"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18"} Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.088076 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.088050331 podStartE2EDuration="4.088050331s" podCreationTimestamp="2026-03-19 00:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:26:00.087026009 +0000 UTC m=+1124.625221170" watchObservedRunningTime="2026-03-19 00:26:00.088050331 +0000 UTC m=+1124.626245462" Mar 19 00:26:00 crc 
kubenswrapper[4745]: I0319 00:26:00.137057 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.138440 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.140847 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.141541 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.143107 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.161471 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.320536 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"auto-csr-approver-29564666-lrgz2\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.422634 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"auto-csr-approver-29564666-lrgz2\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.451396 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"auto-csr-approver-29564666-lrgz2\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.493964 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.716289 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:26:01 crc kubenswrapper[4745]: I0319 00:26:01.073600 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerStarted","Data":"d97390ec66de1b560d581631acdabd76a6dafd4e1f3aabdca27e817c5b8cc973"} Mar 19 00:26:02 crc kubenswrapper[4745]: I0319 00:26:02.091411 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerStarted","Data":"6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e"} Mar 19 00:26:03 crc kubenswrapper[4745]: I0319 00:26:03.099867 4745 generic.go:334] "Generic (PLEG): container finished" podID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerID="6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e" exitCode=0 Mar 19 00:26:03 crc kubenswrapper[4745]: I0319 00:26:03.099954 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerDied","Data":"6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e"} Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.345331 4745 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.486904 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.494152 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx" (OuterVolumeSpecName: "kube-api-access-k5jzx") pod "6fad60f0-0471-47eb-af8b-85d8a4a0c52f" (UID: "6fad60f0-0471-47eb-af8b-85d8a4a0c52f"). InnerVolumeSpecName "kube-api-access-k5jzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.588141 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.113032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerDied","Data":"d97390ec66de1b560d581631acdabd76a6dafd4e1f3aabdca27e817c5b8cc973"} Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.113092 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97390ec66de1b560d581631acdabd76a6dafd4e1f3aabdca27e817c5b8cc973" Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.113111 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.169608 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.174676 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:26:06 crc kubenswrapper[4745]: I0319 00:26:06.147142 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" path="/var/lib/kubelet/pods/559d4ca4-399c-4504-8358-69d88bfdaf3a/volumes" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.300797 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.301665 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" containerID="cri-o://e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" gracePeriod=30 Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.701721 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_6c61d7a4-4470-4cbd-94f5-512619e989f6/docker-build/0.log" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.703125 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835811 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835876 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835959 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835987 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836034 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836068 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836094 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836235 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836264 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836293 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836343 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836399 4745 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836868 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836914 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.838830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.839158 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.839619 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.840243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.840583 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.843454 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.843627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88" (OuterVolumeSpecName: "kube-api-access-97p88") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "kube-api-access-97p88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.843830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937793 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937840 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937863 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937897 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937911 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937920 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937929 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937940 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.940352 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.952786 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.039275 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.039355 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.135352 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_6c61d7a4-4470-4cbd-94f5-512619e989f6/docker-build/0.log" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.135966 4745 generic.go:334] "Generic (PLEG): container finished" podID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" exitCode=1 Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136026 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerDied","Data":"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18"} Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136079 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136137 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerDied","Data":"7e76a7bf4434b7b734c027315e7c4db3ec75ff14cdec4eb02925ef312367c87c"} Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136166 4745 scope.go:117] "RemoveContainer" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.171261 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.179096 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.186510 4745 scope.go:117] "RemoveContainer" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.205416 4745 scope.go:117] "RemoveContainer" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" Mar 19 00:26:08 crc kubenswrapper[4745]: E0319 00:26:08.205944 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18\": container with ID starting with e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18 not found: ID does not exist" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.206010 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18"} err="failed to get container status 
\"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18\": rpc error: code = NotFound desc = could not find container \"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18\": container with ID starting with e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18 not found: ID does not exist" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.206049 4745 scope.go:117] "RemoveContainer" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" Mar 19 00:26:08 crc kubenswrapper[4745]: E0319 00:26:08.206961 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226\": container with ID starting with eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226 not found: ID does not exist" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.207030 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226"} err="failed to get container status \"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226\": rpc error: code = NotFound desc = could not find container \"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226\": container with ID starting with eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226 not found: ID does not exist" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021517 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 19 00:26:09 crc kubenswrapper[4745]: E0319 00:26:09.021852 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="manage-dockerfile" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021870 4745 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="manage-dockerfile" Mar 19 00:26:09 crc kubenswrapper[4745]: E0319 00:26:09.021923 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021931 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" Mar 19 00:26:09 crc kubenswrapper[4745]: E0319 00:26:09.021941 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerName="oc" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021950 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerName="oc" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.022072 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.022090 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerName="oc" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.023172 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.026104 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.026344 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.026637 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.030335 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.049183 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053088 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053297 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053348 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053443 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053567 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053660 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053807 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053840 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053898 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053983 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.156657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.156752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.156801 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157071 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157214 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157217 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157248 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157275 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"sg-core-2-build\" 
(UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157306 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157385 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157455 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157686 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: 
I0319 00:26:09.157937 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157986 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.158151 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.158943 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.162435 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.173350 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.174108 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.338956 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.561288 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 19 00:26:10 crc kubenswrapper[4745]: I0319 00:26:10.146424 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" path="/var/lib/kubelet/pods/6c61d7a4-4470-4cbd-94f5-512619e989f6/volumes" Mar 19 00:26:10 crc kubenswrapper[4745]: I0319 00:26:10.157580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerStarted","Data":"6af6d11839f61cc5eb6c4ada548e09fae08633a6e79b3849dd1445a55e9baf3d"} Mar 19 00:26:10 crc kubenswrapper[4745]: I0319 00:26:10.157630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerStarted","Data":"7aed128a4133bb08f4b1c5a4857236ca0965b3bd6da79c7f58fb80cc2d5df0f9"} Mar 19 00:26:11 crc kubenswrapper[4745]: I0319 00:26:11.165923 4745 generic.go:334] "Generic (PLEG): container finished" podID="534b93f2-ab59-4958-9374-29c114fab497" containerID="6af6d11839f61cc5eb6c4ada548e09fae08633a6e79b3849dd1445a55e9baf3d" exitCode=0 Mar 19 00:26:11 crc kubenswrapper[4745]: I0319 00:26:11.166002 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"6af6d11839f61cc5eb6c4ada548e09fae08633a6e79b3849dd1445a55e9baf3d"} Mar 19 00:26:12 crc kubenswrapper[4745]: I0319 00:26:12.177865 4745 generic.go:334] "Generic (PLEG): container finished" podID="534b93f2-ab59-4958-9374-29c114fab497" containerID="409ec20be66c2df87bfc00e1e2571b2f14c5260bc018c8103a5d083b4bbb413a" exitCode=0 Mar 19 00:26:12 crc kubenswrapper[4745]: I0319 00:26:12.178259 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"409ec20be66c2df87bfc00e1e2571b2f14c5260bc018c8103a5d083b4bbb413a"} Mar 19 00:26:12 crc kubenswrapper[4745]: I0319 00:26:12.219645 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_534b93f2-ab59-4958-9374-29c114fab497/manage-dockerfile/0.log" Mar 19 00:26:13 crc kubenswrapper[4745]: I0319 00:26:13.188655 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerStarted","Data":"364a672dbba13054a5bf498f730b5775c59edbc1f42527c6d8984982a4ab68c2"} Mar 19 00:26:13 crc kubenswrapper[4745]: I0319 00:26:13.218439 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.218420379 podStartE2EDuration="5.218420379s" podCreationTimestamp="2026-03-19 00:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:26:13.214434236 +0000 UTC m=+1137.752629387" watchObservedRunningTime="2026-03-19 00:26:13.218420379 +0000 UTC m=+1137.756615510" Mar 19 00:26:35 crc kubenswrapper[4745]: I0319 00:26:35.218925 4745 scope.go:117] "RemoveContainer" containerID="1663b5c8bcd4ae3a664653728fe6c21020e126b8db8f2cf94f1cfba9c6c7bbc2" Mar 19 00:26:45 crc kubenswrapper[4745]: I0319 00:26:45.606005 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:26:45 crc kubenswrapper[4745]: I0319 00:26:45.607062 4745 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:27:15 crc kubenswrapper[4745]: I0319 00:27:15.606476 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:27:15 crc kubenswrapper[4745]: I0319 00:27:15.607147 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.606013 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.606984 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.607070 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:27:45 crc 
kubenswrapper[4745]: I0319 00:27:45.607964 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.608063 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89" gracePeriod=600 Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.819280 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89" exitCode=0 Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.819382 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89"} Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.819764 4745 scope.go:117] "RemoveContainer" containerID="b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06" Mar 19 00:27:46 crc kubenswrapper[4745]: I0319 00:27:46.830524 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5"} Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.146819 4745 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.148763 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.153032 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.153309 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.154112 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.154413 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.232510 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"auto-csr-approver-29564668-66fs6\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.334390 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"auto-csr-approver-29564668-66fs6\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.355819 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftw9t\" 
(UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"auto-csr-approver-29564668-66fs6\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.472729 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:01 crc kubenswrapper[4745]: I0319 00:28:01.035866 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:28:01 crc kubenswrapper[4745]: W0319 00:28:01.038435 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9ad0116_35eb_40db_8d57_4501affdf59c.slice/crio-c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702 WatchSource:0}: Error finding container c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702: Status 404 returned error can't find the container with id c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702 Mar 19 00:28:01 crc kubenswrapper[4745]: I0319 00:28:01.041738 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:28:01 crc kubenswrapper[4745]: I0319 00:28:01.938393 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564668-66fs6" event={"ID":"c9ad0116-35eb-40db-8d57-4501affdf59c","Type":"ContainerStarted","Data":"c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702"} Mar 19 00:28:02 crc kubenswrapper[4745]: I0319 00:28:02.947986 4745 generic.go:334] "Generic (PLEG): container finished" podID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerID="08e036dc6c9a44bd6fdc8a12f3525fb5e0bf5c4fdd30613e6e3e3b5a2939ce17" exitCode=0 Mar 19 00:28:02 crc kubenswrapper[4745]: I0319 00:28:02.948122 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564668-66fs6" event={"ID":"c9ad0116-35eb-40db-8d57-4501affdf59c","Type":"ContainerDied","Data":"08e036dc6c9a44bd6fdc8a12f3525fb5e0bf5c4fdd30613e6e3e3b5a2939ce17"} Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.218728 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.307672 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"c9ad0116-35eb-40db-8d57-4501affdf59c\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.316005 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t" (OuterVolumeSpecName: "kube-api-access-ftw9t") pod "c9ad0116-35eb-40db-8d57-4501affdf59c" (UID: "c9ad0116-35eb-40db-8d57-4501affdf59c"). InnerVolumeSpecName "kube-api-access-ftw9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.409618 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") on node \"crc\" DevicePath \"\"" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.961803 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564668-66fs6" event={"ID":"c9ad0116-35eb-40db-8d57-4501affdf59c","Type":"ContainerDied","Data":"c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702"} Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.961854 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.962512 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:05 crc kubenswrapper[4745]: I0319 00:28:05.384266 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:28:05 crc kubenswrapper[4745]: I0319 00:28:05.390028 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:28:06 crc kubenswrapper[4745]: I0319 00:28:06.145363 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" path="/var/lib/kubelet/pods/deed3a0c-ada3-41b5-895b-8acc45926539/volumes" Mar 19 00:28:35 crc kubenswrapper[4745]: I0319 00:28:35.302355 4745 scope.go:117] "RemoveContainer" containerID="7d29ab0663977a94ba5c0f15b3cbd0ce7ec172f2fc28bc0ca2d89b44013b1e84" Mar 19 00:29:40 crc kubenswrapper[4745]: I0319 00:29:40.839079 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="534b93f2-ab59-4958-9374-29c114fab497" containerID="364a672dbba13054a5bf498f730b5775c59edbc1f42527c6d8984982a4ab68c2" exitCode=0 Mar 19 00:29:40 crc kubenswrapper[4745]: I0319 00:29:40.839164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"364a672dbba13054a5bf498f730b5775c59edbc1f42527c6d8984982a4ab68c2"} Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.080454 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172286 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172376 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172412 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod 
\"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172436 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172477 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172531 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172600 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172621 
4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172653 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172692 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172739 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.173005 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.173040 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.173521 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.174514 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.175154 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.175166 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.180075 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf" (OuterVolumeSpecName: "kube-api-access-l7lvf") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "kube-api-access-l7lvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.180078 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.180721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.185357 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.274637 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275090 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275176 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275242 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275303 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275359 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275433 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275489 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275541 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.589025 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.682614 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.857692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"7aed128a4133bb08f4b1c5a4857236ca0965b3bd6da79c7f58fb80cc2d5df0f9"} Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.857753 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aed128a4133bb08f4b1c5a4857236ca0965b3bd6da79c7f58fb80cc2d5df0f9" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.857848 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:29:44 crc kubenswrapper[4745]: I0319 00:29:44.825305 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:44 crc kubenswrapper[4745]: I0319 00:29:44.919677 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:45 crc kubenswrapper[4745]: I0319 00:29:45.606219 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:29:45 crc kubenswrapper[4745]: I0319 00:29:45.606693 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.376684 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377024 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="git-clone" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377041 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="git-clone" Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377060 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="manage-dockerfile" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377067 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b93f2-ab59-4958-9374-29c114fab497" 
containerName="manage-dockerfile" Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377088 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerName="oc" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377098 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerName="oc" Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377106 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="docker-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377112 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="docker-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377213 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerName="oc" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377224 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="docker-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377963 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.380392 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.380420 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.381264 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.382181 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.401824 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435507 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435597 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435614 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435635 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435650 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435667 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435685 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435702 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435900 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435978 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.436031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536800 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536853 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536871 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536944 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536975 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537013 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537034 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537099 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537128 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537148 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537171 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537234 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537570 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537970 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.538158 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"sg-bridge-1-build\" (UID: 
\"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.540062 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.540588 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.540973 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.541377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.542120 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc 
kubenswrapper[4745]: I0319 00:29:47.547658 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.548067 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.557091 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.694381 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.888161 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.931433 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerStarted","Data":"4c71a05e40b649b82c93fa62486b6094516be105dfaccb79ed3cf959fdf04635"} Mar 19 00:29:48 crc kubenswrapper[4745]: I0319 00:29:48.942615 4745 generic.go:334] "Generic (PLEG): container finished" podID="107067a8-8942-4ede-9614-121991e06616" containerID="649af08f705b85b72e2b308ff29da2e7d5edce2a304ca0c563098fa2b731a46b" exitCode=0 Mar 19 00:29:48 crc kubenswrapper[4745]: I0319 00:29:48.942692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerDied","Data":"649af08f705b85b72e2b308ff29da2e7d5edce2a304ca0c563098fa2b731a46b"} Mar 19 00:29:49 crc kubenswrapper[4745]: I0319 00:29:49.957333 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerStarted","Data":"ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b"} Mar 19 00:29:49 crc kubenswrapper[4745]: I0319 00:29:49.992023 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.991998218 podStartE2EDuration="2.991998218s" podCreationTimestamp="2026-03-19 00:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:29:49.991838523 +0000 UTC m=+1354.530033664" watchObservedRunningTime="2026-03-19 00:29:49.991998218 +0000 UTC m=+1354.530193349" Mar 19 00:29:56 
crc kubenswrapper[4745]: I0319 00:29:56.002942 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_107067a8-8942-4ede-9614-121991e06616/docker-build/0.log" Mar 19 00:29:56 crc kubenswrapper[4745]: I0319 00:29:56.003993 4745 generic.go:334] "Generic (PLEG): container finished" podID="107067a8-8942-4ede-9614-121991e06616" containerID="ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b" exitCode=1 Mar 19 00:29:56 crc kubenswrapper[4745]: I0319 00:29:56.004048 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerDied","Data":"ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b"} Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.263471 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_107067a8-8942-4ede-9614-121991e06616/docker-build/0.log" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.264554 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392621 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392721 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392835 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392965 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392985 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393026 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393036 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393071 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393201 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393287 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393325 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393796 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393831 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393959 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.394538 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.394573 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.395039 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.395275 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.401065 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.401092 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.401108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq" (OuterVolumeSpecName: "kube-api-access-p6fbq") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "kube-api-access-p6fbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.485471 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.495904 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496332 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496343 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496352 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496361 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496369 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496377 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496385 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496393 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496404 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.760864 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.801230 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.833561 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.839203 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.019493 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_107067a8-8942-4ede-9614-121991e06616/docker-build/0.log" Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.019964 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c71a05e40b649b82c93fa62486b6094516be105dfaccb79ed3cf959fdf04635" Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.020079 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.147024 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107067a8-8942-4ede-9614-121991e06616" path="/var/lib/kubelet/pods/107067a8-8942-4ede-9614-121991e06616/volumes" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.529819 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 19 00:29:59 crc kubenswrapper[4745]: E0319 00:29:59.530144 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="manage-dockerfile" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.530157 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="manage-dockerfile" Mar 19 00:29:59 crc kubenswrapper[4745]: E0319 00:29:59.530179 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="docker-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.530186 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="docker-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.530307 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="docker-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.531253 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.533753 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.534144 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.534227 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.534188 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.550072 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.629674 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630283 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630306 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bj7z\" (UniqueName: 
\"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630363 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630508 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630569 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630658 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630700 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630770 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630799 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630874 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733008 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733095 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733125 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733143 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733165 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733209 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733228 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733261 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733286 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod 
\"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733337 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733739 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733771 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733732 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734009 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 
00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734233 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734250 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.735522 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.748179 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.751507 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.754350 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.847398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.152013 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.152959 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.155917 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.155982 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.158627 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.239093 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.240235 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.243870 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244010 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: 
I0319 00:30:00.244069 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244103 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"auto-csr-approver-29564670-8sw74\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244468 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.252131 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.257761 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.292793 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345163 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod 
\"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345274 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345310 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"auto-csr-approver-29564670-8sw74\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.346122 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.355391 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.364145 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"auto-csr-approver-29564670-8sw74\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.364271 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.479095 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.571195 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.691749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.800484 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:30:00 crc kubenswrapper[4745]: W0319 00:30:00.807977 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a566f97_13b9_4fde_868a_f55bd82a1af6.slice/crio-94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995 WatchSource:0}: Error finding container 94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995: Status 404 returned error can't find the container with id 94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995 Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.066296 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564670-8sw74" event={"ID":"9a566f97-13b9-4fde-868a-f55bd82a1af6","Type":"ContainerStarted","Data":"94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.068896 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerStarted","Data":"f162a6529bd91b5afff5c3c3d61fb698e5e267339659205f4beb33078fc998f2"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.068945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" 
event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerStarted","Data":"a4a63506cc075bbeba17c49e6c84f7af721e430f03dba25dbfdcb03fbd700246"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.071088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerStarted","Data":"76b872a9c674840046cfe8380bd40ec0bac4b1772b00ad0e5157a9592cf2c428"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.071152 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerStarted","Data":"8063e75c0220463aa8c8f5be769f651202ef2ad90cb60349d106fdd478eddfca"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.086594 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" podStartSLOduration=1.086571323 podStartE2EDuration="1.086571323s" podCreationTimestamp="2026-03-19 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:30:01.082472584 +0000 UTC m=+1365.620667715" watchObservedRunningTime="2026-03-19 00:30:01.086571323 +0000 UTC m=+1365.624766454" Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 00:30:02.080997 4745 generic.go:334] "Generic (PLEG): container finished" podID="750c31ab-bd58-4423-bb43-45dccd385cab" containerID="76b872a9c674840046cfe8380bd40ec0bac4b1772b00ad0e5157a9592cf2c428" exitCode=0 Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 00:30:02.081109 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"76b872a9c674840046cfe8380bd40ec0bac4b1772b00ad0e5157a9592cf2c428"} Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 
00:30:02.085346 4745 generic.go:334] "Generic (PLEG): container finished" podID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerID="f162a6529bd91b5afff5c3c3d61fb698e5e267339659205f4beb33078fc998f2" exitCode=0 Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 00:30:02.085389 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerDied","Data":"f162a6529bd91b5afff5c3c3d61fb698e5e267339659205f4beb33078fc998f2"} Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.094370 4745 generic.go:334] "Generic (PLEG): container finished" podID="750c31ab-bd58-4423-bb43-45dccd385cab" containerID="c3e0a9b53a8a5c2e01dc0c2e7771eef3526e9f2a8d106e37fc3e542da15fa712" exitCode=0 Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.094449 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"c3e0a9b53a8a5c2e01dc0c2e7771eef3526e9f2a8d106e37fc3e542da15fa712"} Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.097101 4745 generic.go:334] "Generic (PLEG): container finished" podID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerID="e1e89ee6fc2c85074b56c8f19c7bf183b3c352108812ec9dcafde77f229e8ca5" exitCode=0 Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.097186 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564670-8sw74" event={"ID":"9a566f97-13b9-4fde-868a-f55bd82a1af6","Type":"ContainerDied","Data":"e1e89ee6fc2c85074b56c8f19c7bf183b3c352108812ec9dcafde77f229e8ca5"} Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.157244 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_750c31ab-bd58-4423-bb43-45dccd385cab/manage-dockerfile/0.log" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.413349 4745 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.498051 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.498198 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.498253 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.499256 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" (UID: "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.505701 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d" (OuterVolumeSpecName: "kube-api-access-pmp5d") pod "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" (UID: "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848"). InnerVolumeSpecName "kube-api-access-pmp5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.506646 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" (UID: "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.599887 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.600257 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.600341 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.106630 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.107000 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerDied","Data":"a4a63506cc075bbeba17c49e6c84f7af721e430f03dba25dbfdcb03fbd700246"} Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.107502 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a63506cc075bbeba17c49e6c84f7af721e430f03dba25dbfdcb03fbd700246" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.109067 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerStarted","Data":"b6a387acf56a10a05c81ffb80083f81d17823aeec4236abe8ff34638d4305403"} Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.145527 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.145505605 podStartE2EDuration="5.145505605s" podCreationTimestamp="2026-03-19 00:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:30:04.143651498 +0000 UTC m=+1368.681846629" watchObservedRunningTime="2026-03-19 00:30:04.145505605 +0000 UTC m=+1368.683700736" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.364870 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.509549 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"9a566f97-13b9-4fde-868a-f55bd82a1af6\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.515308 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff" (OuterVolumeSpecName: "kube-api-access-tzdff") pod "9a566f97-13b9-4fde-868a-f55bd82a1af6" (UID: "9a566f97-13b9-4fde-868a-f55bd82a1af6"). InnerVolumeSpecName "kube-api-access-tzdff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.611269 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.117187 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564670-8sw74" event={"ID":"9a566f97-13b9-4fde-868a-f55bd82a1af6","Type":"ContainerDied","Data":"94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995"} Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.117240 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995" Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.117239 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.426211 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.433498 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:30:06 crc kubenswrapper[4745]: I0319 00:30:06.145794 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" path="/var/lib/kubelet/pods/d9a8819f-c57d-463c-9089-fbf3b29e12bc/volumes" Mar 19 00:30:15 crc kubenswrapper[4745]: I0319 00:30:15.606973 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:30:15 crc kubenswrapper[4745]: I0319 00:30:15.607945 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:30:35 crc kubenswrapper[4745]: I0319 00:30:35.376707 4745 scope.go:117] "RemoveContainer" containerID="102c360c5a32588e5b65407ade841670a288ba3942421d8abc329207a20bc972" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.605977 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:30:45 crc kubenswrapper[4745]: 
I0319 00:30:45.606593 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.606656 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.608003 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.608077 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5" gracePeriod=600 Mar 19 00:30:46 crc kubenswrapper[4745]: I0319 00:30:46.408039 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5" exitCode=0 Mar 19 00:30:46 crc kubenswrapper[4745]: I0319 00:30:46.408148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5"} Mar 19 00:30:46 crc 
kubenswrapper[4745]: I0319 00:30:46.408539 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"} Mar 19 00:30:46 crc kubenswrapper[4745]: I0319 00:30:46.408571 4745 scope.go:117] "RemoveContainer" containerID="b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89" Mar 19 00:30:47 crc kubenswrapper[4745]: I0319 00:30:47.418644 4745 generic.go:334] "Generic (PLEG): container finished" podID="750c31ab-bd58-4423-bb43-45dccd385cab" containerID="b6a387acf56a10a05c81ffb80083f81d17823aeec4236abe8ff34638d4305403" exitCode=0 Mar 19 00:30:47 crc kubenswrapper[4745]: I0319 00:30:47.418716 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"b6a387acf56a10a05c81ffb80083f81d17823aeec4236abe8ff34638d4305403"} Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.714570 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785004 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785055 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785104 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785131 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785164 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785185 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785230 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785261 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785306 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785430 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785488 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") 
pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785515 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785737 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785975 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786393 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786446 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786476 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786563 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.788657 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.789270 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.794626 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.797009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.802532 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z" (OuterVolumeSpecName: "kube-api-access-2bj7z") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "kube-api-access-2bj7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888088 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888167 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888189 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888209 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888225 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888240 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888256 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888302 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888318 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.918133 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.989979 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.443705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"8063e75c0220463aa8c8f5be769f651202ef2ad90cb60349d106fdd478eddfca"} Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.443760 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8063e75c0220463aa8c8f5be769f651202ef2ad90cb60349d106fdd478eddfca" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.443902 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.559422 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.601207 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.881799 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882776 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="manage-dockerfile" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882790 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="manage-dockerfile" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882803 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerName="oc" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882809 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerName="oc" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882818 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="git-clone" Mar 19 00:30:53 crc 
kubenswrapper[4745]: I0319 00:30:53.882825 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="git-clone" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882836 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerName="collect-profiles" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882841 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerName="collect-profiles" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882853 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="docker-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882858 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="docker-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882986 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="docker-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882999 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerName="collect-profiles" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.883011 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerName="oc" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.883659 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.885859 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.886386 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.886835 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.887243 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.899132 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062158 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062684 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062714 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062793 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062830 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062862 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062901 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") 
" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063063 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063117 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063160 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063187 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063346 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164294 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164363 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164382 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164406 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164432 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164475 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164496 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164527 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc 
kubenswrapper[4745]: I0319 00:30:54.164557 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164589 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164643 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164645 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.165110 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.165338 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166230 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166556 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166794 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166941 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.167106 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.171375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.179599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.187702 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcqg\" (UniqueName: 
\"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.203669 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.606692 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:30:55 crc kubenswrapper[4745]: I0319 00:30:55.485534 4745 generic.go:334] "Generic (PLEG): container finished" podID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerID="8e243def335e49da6a1d57cd97c25f6d3979c96c8c683baa98a92a5cca110908" exitCode=0 Mar 19 00:30:55 crc kubenswrapper[4745]: I0319 00:30:55.485660 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerDied","Data":"8e243def335e49da6a1d57cd97c25f6d3979c96c8c683baa98a92a5cca110908"} Mar 19 00:30:55 crc kubenswrapper[4745]: I0319 00:30:55.486019 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerStarted","Data":"d99deadfe1fdaec34e88fd1333d4bf6a3807b8f55f4f7cd2551dba743f9ec241"} Mar 19 00:30:56 crc kubenswrapper[4745]: I0319 00:30:56.502375 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerStarted","Data":"81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7"} Mar 19 00:30:56 crc kubenswrapper[4745]: I0319 00:30:56.530821 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.530797326 podStartE2EDuration="3.530797326s" podCreationTimestamp="2026-03-19 00:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:30:56.530281669 +0000 UTC m=+1421.068476810" watchObservedRunningTime="2026-03-19 00:30:56.530797326 +0000 UTC m=+1421.068992457" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.287734 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.289342 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build" containerID="cri-o://81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7" gracePeriod=30 Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.552211 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_cba4406e-8dc7-4dc5-a7c0-7778f01d1028/docker-build/0.log" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.553112 4745 generic.go:334] "Generic (PLEG): container finished" podID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerID="81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7" exitCode=1 Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.553172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerDied","Data":"81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7"} Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.689768 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_cba4406e-8dc7-4dc5-a7c0-7778f01d1028/docker-build/0.log" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.690197 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718420 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718562 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718621 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718660 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718703 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718725 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718833 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718864 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: 
\"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718929 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718983 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719282 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719398 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.720370 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.720433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.720610 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719819 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.736859 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.737085 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.737125 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg" (OuterVolumeSpecName: "kube-api-access-sdcqg") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "kube-api-access-sdcqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.796684 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820573 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820658 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820672 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820681 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820690 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820699 4745 
reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820743 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820756 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820765 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820795 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820804 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.120779 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.124411 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.562388 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_cba4406e-8dc7-4dc5-a7c0-7778f01d1028/docker-build/0.log" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.563009 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerDied","Data":"d99deadfe1fdaec34e88fd1333d4bf6a3807b8f55f4f7cd2551dba743f9ec241"} Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.563070 4745 scope.go:117] "RemoveContainer" containerID="81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.563304 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.592065 4745 scope.go:117] "RemoveContainer" containerID="8e243def335e49da6a1d57cd97c25f6d3979c96c8c683baa98a92a5cca110908" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.619249 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.623639 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.992973 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 19 00:31:05 crc kubenswrapper[4745]: E0319 00:31:05.993525 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.993545 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build" Mar 19 00:31:05 crc kubenswrapper[4745]: E0319 00:31:05.993557 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="manage-dockerfile" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.993564 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="manage-dockerfile" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.994657 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.997066 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.999193 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.000967 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.001269 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.002476 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.015717 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040419 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040479 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040516 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040553 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040586 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040654 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040681 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040711 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040741 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.041113 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143094 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143144 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143185 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143277 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143299 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143319 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143338 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143357 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc 
kubenswrapper[4745]: I0319 00:31:06.143377 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143399 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143427 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143456 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143574 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144338 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144352 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144480 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144853 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.145230 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.146152 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" path="/var/lib/kubelet/pods/cba4406e-8dc7-4dc5-a7c0-7778f01d1028/volumes" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.148950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.151233 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" 
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.167094 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.318290 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.526686 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 19 00:31:06 crc kubenswrapper[4745]: W0319 00:31:06.533789 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3cc8a26_ff02_4e22_b924_7ba0a0bf0cdb.slice/crio-8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6 WatchSource:0}: Error finding container 8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6: Status 404 returned error can't find the container with id 8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6 Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.573355 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerStarted","Data":"8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6"} Mar 19 00:31:07 crc kubenswrapper[4745]: I0319 00:31:07.583262 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerStarted","Data":"12de399145ebf4351ca99846af1271e3bc7ba9b124e5e71af2424115e30a6462"} Mar 19 00:31:08 crc 
kubenswrapper[4745]: I0319 00:31:08.617969 4745 generic.go:334] "Generic (PLEG): container finished" podID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerID="12de399145ebf4351ca99846af1271e3bc7ba9b124e5e71af2424115e30a6462" exitCode=0 Mar 19 00:31:08 crc kubenswrapper[4745]: I0319 00:31:08.618037 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"12de399145ebf4351ca99846af1271e3bc7ba9b124e5e71af2424115e30a6462"} Mar 19 00:31:09 crc kubenswrapper[4745]: I0319 00:31:09.632309 4745 generic.go:334] "Generic (PLEG): container finished" podID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerID="6caa75a3508ec6bf1e70827ac043333a39f59633de038b233135fa516bdf907e" exitCode=0 Mar 19 00:31:09 crc kubenswrapper[4745]: I0319 00:31:09.632439 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"6caa75a3508ec6bf1e70827ac043333a39f59633de038b233135fa516bdf907e"} Mar 19 00:31:09 crc kubenswrapper[4745]: I0319 00:31:09.684433 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb/manage-dockerfile/0.log" Mar 19 00:31:10 crc kubenswrapper[4745]: I0319 00:31:10.640857 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerStarted","Data":"183bda858c89f1750355a842d8b2c90e06898d12268cec8ffa2340518a400206"} Mar 19 00:31:10 crc kubenswrapper[4745]: I0319 00:31:10.675573 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.675542542 podStartE2EDuration="5.675542542s" podCreationTimestamp="2026-03-19 
00:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:31:10.670850145 +0000 UTC m=+1435.209045286" watchObservedRunningTime="2026-03-19 00:31:10.675542542 +0000 UTC m=+1435.213737693" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.145984 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"] Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.147394 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.151634 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.151800 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.152772 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.154118 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"] Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.168234 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"auto-csr-approver-29564672-tpcgv\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.270634 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2w9f\" (UniqueName: 
\"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"auto-csr-approver-29564672-tpcgv\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.291680 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"auto-csr-approver-29564672-tpcgv\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.470850 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.707916 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"] Mar 19 00:32:01 crc kubenswrapper[4745]: I0319 00:32:01.306059 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" event={"ID":"550c50ae-5519-4c0d-b2b0-7415d134808f","Type":"ContainerStarted","Data":"0748e92f88b6a9584f3470bc17753ccc9bd18b6a77cecd36fb08ab3c8f42cac9"} Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.313959 4745 generic.go:334] "Generic (PLEG): container finished" podID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerID="ad4d25cdb1eb7abf3f1713a4b642271a43e4a8fa68c0fb36024884e82f682adb" exitCode=0 Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.314013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" event={"ID":"550c50ae-5519-4c0d-b2b0-7415d134808f","Type":"ContainerDied","Data":"ad4d25cdb1eb7abf3f1713a4b642271a43e4a8fa68c0fb36024884e82f682adb"} Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.316991 4745 generic.go:334] 
"Generic (PLEG): container finished" podID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerID="183bda858c89f1750355a842d8b2c90e06898d12268cec8ffa2340518a400206" exitCode=0 Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.317030 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"183bda858c89f1750355a842d8b2c90e06898d12268cec8ffa2340518a400206"} Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.689354 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.693010 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.784968 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785054 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785090 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: 
\"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785145 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785173 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785191 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785249 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785272 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785325 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785347 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785423 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"550c50ae-5519-4c0d-b2b0-7415d134808f\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785447 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nvmj\" (UniqueName: 
\"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785789 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786367 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786443 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786893 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.787215 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.788554 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.792370 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj" (OuterVolumeSpecName: "kube-api-access-6nvmj") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "kube-api-access-6nvmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.793283 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.794549 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.815892 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f" (OuterVolumeSpecName: "kube-api-access-c2w9f") pod "550c50ae-5519-4c0d-b2b0-7415d134808f" (UID: "550c50ae-5519-4c0d-b2b0-7415d134808f"). InnerVolumeSpecName "kube-api-access-c2w9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886910 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886950 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886962 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886975 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886986 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886995 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887003 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887011 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887020 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887029 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.903763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.988793 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.335328 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6"} Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.335690 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.335405 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.337408 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" event={"ID":"550c50ae-5519-4c0d-b2b0-7415d134808f","Type":"ContainerDied","Data":"0748e92f88b6a9584f3470bc17753ccc9bd18b6a77cecd36fb08ab3c8f42cac9"} Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.337437 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0748e92f88b6a9584f3470bc17753ccc9bd18b6a77cecd36fb08ab3c8f42cac9" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.337498 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.676212 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.703013 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.786073 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.791932 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:32:06 crc kubenswrapper[4745]: I0319 00:32:06.146117 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" path="/var/lib/kubelet/pods/6fad60f0-0471-47eb-af8b-85d8a4a0c52f/volumes" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.380907 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381866 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="docker-build" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381897 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="docker-build" Mar 19 
00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381916 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="git-clone" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381921 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="git-clone" Mar 19 00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381930 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="manage-dockerfile" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381938 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="manage-dockerfile" Mar 19 00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381951 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerName="oc" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381956 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerName="oc" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.382057 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="docker-build" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.382072 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerName="oc" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.383009 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.384945 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.385028 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.385096 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.392545 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.485945 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.486023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.486070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.486754 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.487034 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.512906 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.703515 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:11 crc kubenswrapper[4745]: I0319 00:32:11.111062 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:11 crc kubenswrapper[4745]: I0319 00:32:11.382099 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerStarted","Data":"9f4642b49f8bb7f219967f76570586d6870f2fbf5d9b181304387179ff6fbf3b"} Mar 19 00:32:12 crc kubenswrapper[4745]: I0319 00:32:12.391259 4745 generic.go:334] "Generic (PLEG): container finished" podID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8" exitCode=0 Mar 19 00:32:12 crc kubenswrapper[4745]: I0319 00:32:12.391321 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"} Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.026771 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.028286 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030447 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030670 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030765 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.056091 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134753 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc 
kubenswrapper[4745]: I0319 00:32:13.134897 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134936 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135083 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135104 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135127 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135149 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236344 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236404 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236435 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236466 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236510 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236544 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236572 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236598 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236638 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236660 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236685 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236737 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236957 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237139 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237172 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: 
I0319 00:32:13.237615 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237918 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.238661 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.243005 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.246402 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.263382 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.345751 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.434840 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerStarted","Data":"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"} Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.691656 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:13 crc kubenswrapper[4745]: W0319 00:32:13.693009 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b87c190_e4f0_4423_bcb7_942badcf90a9.slice/crio-58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9 WatchSource:0}: Error finding container 58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9: Status 404 returned error can't find the container with id 58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9 Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.444353 4745 generic.go:334] "Generic (PLEG): container finished" podID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7" exitCode=0 Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.444459 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"} Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.447529 4745 generic.go:334] "Generic (PLEG): container finished" podID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerID="974a4b2b0120bf7547c402b35bf7ecab55db0da6f49394541dc6bc7af4cdda92" exitCode=0 Mar 19 00:32:14 crc 
kubenswrapper[4745]: I0319 00:32:14.447658 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerDied","Data":"974a4b2b0120bf7547c402b35bf7ecab55db0da6f49394541dc6bc7af4cdda92"} Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.447779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerStarted","Data":"58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9"} Mar 19 00:32:15 crc kubenswrapper[4745]: I0319 00:32:15.460531 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerStarted","Data":"f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a"} Mar 19 00:32:15 crc kubenswrapper[4745]: I0319 00:32:15.495136 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-1-build" podStartSLOduration=2.495104659 podStartE2EDuration="2.495104659s" podCreationTimestamp="2026-03-19 00:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:32:15.494227733 +0000 UTC m=+1500.032422884" watchObservedRunningTime="2026-03-19 00:32:15.495104659 +0000 UTC m=+1500.033299790" Mar 19 00:32:19 crc kubenswrapper[4745]: I0319 00:32:19.510457 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerStarted","Data":"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"} Mar 19 00:32:19 crc kubenswrapper[4745]: I0319 00:32:19.536767 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cwrwg" podStartSLOduration=6.0298041 podStartE2EDuration="9.536745421s" podCreationTimestamp="2026-03-19 00:32:10 +0000 UTC" firstStartedPulling="2026-03-19 00:32:12.394078698 +0000 UTC m=+1496.932273829" lastFinishedPulling="2026-03-19 00:32:15.901020019 +0000 UTC m=+1500.439215150" observedRunningTime="2026-03-19 00:32:19.535572594 +0000 UTC m=+1504.073767735" watchObservedRunningTime="2026-03-19 00:32:19.536745421 +0000 UTC m=+1504.074940542" Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.519688 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_7b87c190-e4f0-4423-bcb7-942badcf90a9/docker-build/0.log" Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.521456 4745 generic.go:334] "Generic (PLEG): container finished" podID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerID="f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a" exitCode=1 Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.521589 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerDied","Data":"f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a"} Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.704635 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.705129 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.752441 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwrwg" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" 
containerName="registry-server" probeResult="failure" output=< Mar 19 00:32:21 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Mar 19 00:32:21 crc kubenswrapper[4745]: > Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.784123 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_7b87c190-e4f0-4423-bcb7-942badcf90a9/docker-build/0.log" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.784559 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880351 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880436 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880485 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880494 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets" 
(OuterVolumeSpecName: "node-pullsecrets") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880527 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880554 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880610 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880633 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880664 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: 
\"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880697 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880703 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880966 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.881117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.881140 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 
00:32:21.881804 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882389 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882269 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882291 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882728 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882731 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882973 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882997 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.883456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.888604 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.888668 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl" (OuterVolumeSpecName: "kube-api-access-9cngl") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "kube-api-access-9cngl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.889048 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983771 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983811 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983823 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983832 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983843 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983851 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983863 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983873 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983908 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983917 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.537848 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_7b87c190-e4f0-4423-bcb7-942badcf90a9/docker-build/0.log" Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.538444 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerDied","Data":"58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9"} Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.538501 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9" Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.538734 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:23 crc kubenswrapper[4745]: I0319 00:32:23.523057 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:23 crc kubenswrapper[4745]: I0319 00:32:23.529873 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:24 crc kubenswrapper[4745]: I0319 00:32:24.147316 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" path="/var/lib/kubelet/pods/7b87c190-e4f0-4423-bcb7-942badcf90a9/volumes" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206329 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 19 00:32:25 crc kubenswrapper[4745]: E0319 00:32:25.206667 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="docker-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206681 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="docker-build" Mar 19 00:32:25 crc kubenswrapper[4745]: E0319 00:32:25.206693 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="manage-dockerfile" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206700 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="manage-dockerfile" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206809 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="docker-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.209193 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212084 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212084 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212089 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212686 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.228272 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334738 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334782 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334808 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334835 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334855 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335072 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335137 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335378 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335506 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335701 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.436873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437274 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437450 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437547 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437452 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437630 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437987 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438068 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438108 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438339 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438490 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438745 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438519 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438794 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 
crc kubenswrapper[4745]: I0319 00:32:25.439007 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.439117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.439385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.439608 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.444246 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod 
\"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.444297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.458922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.527070 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.765293 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 19 00:32:26 crc kubenswrapper[4745]: I0319 00:32:26.571604 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerStarted","Data":"34a689b37d20c977ba6ca58741a9591bcdc32db9ecb4715c1a496a9b03fde054"} Mar 19 00:32:26 crc kubenswrapper[4745]: I0319 00:32:26.572011 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerStarted","Data":"95124c5725c9d01c054231894a7b4772b99823bf7deea19e1ef7de9c84f0ca6e"} Mar 19 00:32:27 crc kubenswrapper[4745]: I0319 00:32:27.580327 4745 generic.go:334] "Generic (PLEG): container finished" podID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerID="34a689b37d20c977ba6ca58741a9591bcdc32db9ecb4715c1a496a9b03fde054" exitCode=0 Mar 19 00:32:27 crc kubenswrapper[4745]: I0319 00:32:27.580386 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"34a689b37d20c977ba6ca58741a9591bcdc32db9ecb4715c1a496a9b03fde054"} Mar 19 00:32:28 crc kubenswrapper[4745]: I0319 00:32:28.594202 4745 generic.go:334] "Generic (PLEG): container finished" podID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerID="9742c972b6ec6febdd5f2783832bb25d24125697d9b00e4733b8583e4a9188f1" exitCode=0 Mar 19 00:32:28 crc kubenswrapper[4745]: I0319 00:32:28.594391 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"9742c972b6ec6febdd5f2783832bb25d24125697d9b00e4733b8583e4a9188f1"} Mar 19 00:32:28 crc kubenswrapper[4745]: I0319 00:32:28.632870 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_00084aa5-c66b-4c62-a0c3-422e3c02286a/manage-dockerfile/0.log" Mar 19 00:32:29 crc kubenswrapper[4745]: I0319 00:32:29.605237 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerStarted","Data":"b4ec71f0453639d110d361d2fd4ca7b279631514bab4c9ba34ffe4b52644ca79"} Mar 19 00:32:29 crc kubenswrapper[4745]: I0319 00:32:29.631554 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.631528646 podStartE2EDuration="4.631528646s" podCreationTimestamp="2026-03-19 00:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:32:29.631152714 +0000 UTC m=+1514.169347865" watchObservedRunningTime="2026-03-19 00:32:29.631528646 +0000 UTC m=+1514.169723787" Mar 19 00:32:30 crc kubenswrapper[4745]: I0319 00:32:30.747843 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:30 crc kubenswrapper[4745]: I0319 00:32:30.792962 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:30 crc kubenswrapper[4745]: I0319 00:32:30.990954 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:31 crc kubenswrapper[4745]: I0319 00:32:31.623027 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerID="b4ec71f0453639d110d361d2fd4ca7b279631514bab4c9ba34ffe4b52644ca79" exitCode=0 Mar 19 00:32:31 crc kubenswrapper[4745]: I0319 00:32:31.623072 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"b4ec71f0453639d110d361d2fd4ca7b279631514bab4c9ba34ffe4b52644ca79"} Mar 19 00:32:32 crc kubenswrapper[4745]: I0319 00:32:32.630630 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cwrwg" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server" containerID="cri-o://ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" gracePeriod=2 Mar 19 00:32:32 crc kubenswrapper[4745]: I0319 00:32:32.900126 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.013057 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050383 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050448 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050571 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050594 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050626 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050650 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050675 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050711 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050742 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvpkr\" (UniqueName: 
\"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050794 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.051186 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.051499 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052284 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052328 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052667 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052779 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.054507 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.055076 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.057145 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.058329 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.058433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr" (OuterVolumeSpecName: "kube-api-access-lvpkr") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "kube-api-access-lvpkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.061423 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152028 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"197ef947-3c30-4a50-ade4-01f72410e5cf\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"197ef947-3c30-4a50-ade4-01f72410e5cf\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152244 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"197ef947-3c30-4a50-ade4-01f72410e5cf\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152526 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152546 4745 reconciler_common.go:293] "Volume detached for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152559 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152572 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152582 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152593 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152603 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152613 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152623 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152634 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152645 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152658 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.154034 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities" (OuterVolumeSpecName: "utilities") pod "197ef947-3c30-4a50-ade4-01f72410e5cf" (UID: "197ef947-3c30-4a50-ade4-01f72410e5cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.156051 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7" (OuterVolumeSpecName: "kube-api-access-n4wz7") pod "197ef947-3c30-4a50-ade4-01f72410e5cf" (UID: "197ef947-3c30-4a50-ade4-01f72410e5cf"). InnerVolumeSpecName "kube-api-access-n4wz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.254445 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.254717 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.294450 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "197ef947-3c30-4a50-ade4-01f72410e5cf" (UID: "197ef947-3c30-4a50-ade4-01f72410e5cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.355760 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.638125 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"95124c5725c9d01c054231894a7b4772b99823bf7deea19e1ef7de9c84f0ca6e"}
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.638184 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95124c5725c9d01c054231894a7b4772b99823bf7deea19e1ef7de9c84f0ca6e"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.639417 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640168 4745 generic.go:334] "Generic (PLEG): container finished" podID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" exitCode=0
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640203 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640206 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"}
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640295 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"9f4642b49f8bb7f219967f76570586d6870f2fbf5d9b181304387179ff6fbf3b"}
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640317 4745 scope.go:117] "RemoveContainer" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.665313 4745 scope.go:117] "RemoveContainer" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.681145 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"]
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.688279 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"]
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.704722 4745 scope.go:117] "RemoveContainer" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.720242 4745 scope.go:117] "RemoveContainer" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"
Mar 19 00:32:33 crc kubenswrapper[4745]: E0319 00:32:33.720714 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d\": container with ID starting with ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d not found: ID does not exist" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.720777 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"} err="failed to get container status \"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d\": rpc error: code = NotFound desc = could not find container \"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d\": container with ID starting with ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d not found: ID does not exist"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.720810 4745 scope.go:117] "RemoveContainer" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"
Mar 19 00:32:33 crc kubenswrapper[4745]: E0319 00:32:33.721270 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7\": container with ID starting with fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7 not found: ID does not exist" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.721294 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"} err="failed to get container status \"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7\": rpc error: code = NotFound desc = could not find container \"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7\": container with ID starting with fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7 not found: ID does not exist"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.721311 4745 scope.go:117] "RemoveContainer" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"
Mar 19 00:32:33 crc kubenswrapper[4745]: E0319 00:32:33.721567 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8\": container with ID starting with 6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8 not found: ID does not exist" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"
Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.721588 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"} err="failed to get container status \"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8\": rpc error: code = NotFound desc = could not find container \"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8\": container with ID starting with 6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8 not found: ID does not exist"
Mar 19 00:32:34 crc kubenswrapper[4745]: I0319 00:32:34.147158 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" path="/var/lib/kubelet/pods/197ef947-3c30-4a50-ade4-01f72410e5cf/volumes"
Mar 19 00:32:35 crc kubenswrapper[4745]: I0319 00:32:35.471772 4745 scope.go:117] "RemoveContainer" containerID="6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656196 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656566 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-utilities"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656585 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-utilities"
Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656595 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="manage-dockerfile"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656603 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="manage-dockerfile"
Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656617 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-content"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656630 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-content"
Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656648 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="git-clone"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656656 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="git-clone"
Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656682 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656689 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server"
Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656696 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="docker-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656706 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="docker-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656838 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="docker-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656854 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.657662 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.660407 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.660463 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.660974 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.661325 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.677287 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.801928 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802023 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802053 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802113 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802145 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802186 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802357 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802618 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802751 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802816 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.903819 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904375 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904425 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904450 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904474 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904553 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904579 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904654 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904687 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904721 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905104 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905126 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905180 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905368 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.906264 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.906321 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.906871 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.911585 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.917922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.923640 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.974583 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.184161 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.672756 4745 generic.go:334] "Generic (PLEG): container finished" podID="479407cb-fdef-474d-a564-881954a984db" containerID="41cdf9f33044f6a4909a9e2e26ad76fb6b92253759abe1d7140516760d28b75c" exitCode=0
Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.672835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerDied","Data":"41cdf9f33044f6a4909a9e2e26ad76fb6b92253759abe1d7140516760d28b75c"}
Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.672921 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerStarted","Data":"66300060985a6ce250cf2a184b02513f9286f734f0e799ca10c3c982703dd624"}
Mar 19 00:32:38 crc kubenswrapper[4745]: I0319 00:32:38.682000 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_479407cb-fdef-474d-a564-881954a984db/docker-build/0.log"
Mar 19 00:32:38 crc kubenswrapper[4745]: I0319 00:32:38.683036 4745 generic.go:334] "Generic (PLEG): container finished" podID="479407cb-fdef-474d-a564-881954a984db" containerID="3d3abcaceec0d44feeaa99fe5fc507d2939843b4e5f2688e33f19d72f84aabe1" exitCode=1
Mar 19 00:32:38 crc kubenswrapper[4745]: I0319 00:32:38.683097 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerDied","Data":"3d3abcaceec0d44feeaa99fe5fc507d2939843b4e5f2688e33f19d72f84aabe1"}
Mar 19 00:32:39 crc kubenswrapper[4745]: I0319 00:32:39.986796 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_479407cb-fdef-474d-a564-881954a984db/docker-build/0.log"
Mar 19 00:32:39 crc kubenswrapper[4745]: I0319 00:32:39.987566 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052275 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052361 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052406 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052426 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052648 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052675 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052702 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052730 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052755 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052786 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052840 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052914 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "node-pullsecrets".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053192 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053210 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053249 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053614 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053653 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053672 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053807 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.054070 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.054413 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.058065 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.058070 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2" (OuterVolumeSpecName: "kube-api-access-k5pl2") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "kube-api-access-k5pl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.058347 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154102 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154134 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154143 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154152 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154162 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154170 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154178 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") 
on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154187 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154196 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154206 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.698971 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_479407cb-fdef-474d-a564-881954a984db/docker-build/0.log" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.699853 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerDied","Data":"66300060985a6ce250cf2a184b02513f9286f734f0e799ca10c3c982703dd624"} Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.699907 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66300060985a6ce250cf2a184b02513f9286f734f0e799ca10c3c982703dd624" Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.699998 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:45 crc kubenswrapper[4745]: I0319 00:32:45.606707 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:32:45 crc kubenswrapper[4745]: I0319 00:32:45.608043 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:32:47 crc kubenswrapper[4745]: I0319 00:32:47.119316 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 19 00:32:47 crc kubenswrapper[4745]: I0319 00:32:47.128653 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.147286 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479407cb-fdef-474d-a564-881954a984db" path="/var/lib/kubelet/pods/479407cb-fdef-474d-a564-881954a984db/volumes" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824296 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 19 00:32:48 crc kubenswrapper[4745]: E0319 00:32:48.824633 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="manage-dockerfile" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824651 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="479407cb-fdef-474d-a564-881954a984db" 
containerName="manage-dockerfile" Mar 19 00:32:48 crc kubenswrapper[4745]: E0319 00:32:48.824673 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="docker-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824681 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="docker-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824843 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="docker-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.825998 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.828987 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.829104 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.829141 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.829260 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.850639 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.981754 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982315 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982367 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982389 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982435 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982467 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982514 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982554 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982604 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982627 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084350 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084439 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084475 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084503 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084526 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084566 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084616 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084634 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084656 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084672 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc 
kubenswrapper[4745]: I0319 00:32:49.084688 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084953 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085011 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085211 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085345 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085507 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085561 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085619 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085927 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.091919 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.096451 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.109913 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.149255 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.460277 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.787469 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerStarted","Data":"fee07e53cb9d15670b053d3a0bff1e14a6fd7d3736db84b1b91153328d1b6b1f"} Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.787536 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerStarted","Data":"f1794e32a011e5644ffc79544be3fbbd276268a0dd008ac9067dc718c52ae74d"} Mar 19 00:32:50 crc kubenswrapper[4745]: I0319 00:32:50.797347 4745 generic.go:334] "Generic (PLEG): container finished" podID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerID="fee07e53cb9d15670b053d3a0bff1e14a6fd7d3736db84b1b91153328d1b6b1f" exitCode=0 Mar 19 00:32:50 crc kubenswrapper[4745]: I0319 00:32:50.797435 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"fee07e53cb9d15670b053d3a0bff1e14a6fd7d3736db84b1b91153328d1b6b1f"} Mar 19 00:32:51 crc kubenswrapper[4745]: I0319 00:32:51.807735 4745 generic.go:334] "Generic (PLEG): container finished" podID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerID="f98cd947aae65c5442381e3256226970531d2fb07bf7814d884c6af3441f3d64" exitCode=0 Mar 19 00:32:51 crc kubenswrapper[4745]: I0319 00:32:51.807811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" 
event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"f98cd947aae65c5442381e3256226970531d2fb07bf7814d884c6af3441f3d64"} Mar 19 00:32:51 crc kubenswrapper[4745]: I0319 00:32:51.852169 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_9faac743-5f49-4d6a-bb53-da4b1178ee26/manage-dockerfile/0.log" Mar 19 00:32:52 crc kubenswrapper[4745]: I0319 00:32:52.817244 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerStarted","Data":"fea6a727e51f4292705620c79681af6b9ca4114d156adcb01d758ba669b14e5e"} Mar 19 00:32:52 crc kubenswrapper[4745]: I0319 00:32:52.854355 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=4.854327183 podStartE2EDuration="4.854327183s" podCreationTimestamp="2026-03-19 00:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:32:52.848368357 +0000 UTC m=+1537.386563488" watchObservedRunningTime="2026-03-19 00:32:52.854327183 +0000 UTC m=+1537.392522314" Mar 19 00:32:54 crc kubenswrapper[4745]: I0319 00:32:54.831659 4745 generic.go:334] "Generic (PLEG): container finished" podID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerID="fea6a727e51f4292705620c79681af6b9ca4114d156adcb01d758ba669b14e5e" exitCode=0 Mar 19 00:32:54 crc kubenswrapper[4745]: I0319 00:32:54.831856 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"fea6a727e51f4292705620c79681af6b9ca4114d156adcb01d758ba669b14e5e"} Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.095102 4745 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196408 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196477 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196515 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196615 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196641 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196668 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196715 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196737 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196760 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196781 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.197059 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.197295 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.197824 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198363 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198610 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.200015 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.201627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.203147 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw" (OuterVolumeSpecName: "kube-api-access-pn2fw") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "kube-api-access-pn2fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.204065 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.209118 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298046 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298090 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298101 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298114 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298123 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298131 4745 
reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298140 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298150 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298162 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298174 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298185 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298194 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.848928 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"f1794e32a011e5644ffc79544be3fbbd276268a0dd008ac9067dc718c52ae74d"} Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.848982 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1794e32a011e5644ffc79544be3fbbd276268a0dd008ac9067dc718c52ae74d" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.849021 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.981583 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 19 00:33:11 crc kubenswrapper[4745]: E0319 00:33:11.982567 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="git-clone" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982583 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="git-clone" Mar 19 00:33:11 crc kubenswrapper[4745]: E0319 00:33:11.982615 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="manage-dockerfile" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982623 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="manage-dockerfile" Mar 19 00:33:11 crc kubenswrapper[4745]: E0319 00:33:11.982633 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="docker-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982641 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" 
containerName="docker-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982775 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="docker-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.983714 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.985730 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.985735 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.986318 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.986701 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.988678 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.006539 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137132 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137211 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137269 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137300 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137334 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137366 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137397 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137424 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137521 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137548 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239412 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239436 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239460 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239564 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239594 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239611 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239632 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239660 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239703 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240023 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240136 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240381 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240891 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: 
I0319 00:33:12.241181 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.241218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.241325 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.246294 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.246318 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.251509 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.260180 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.308658 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.523470 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.971740 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerStarted","Data":"d9edc2ec4c7404182e0b8c14859b05a93541996296f20b1e6a8161d5fc3e98fc"} Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.972250 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerStarted","Data":"56d6c6c471c42b322ae3628ab1707980810345401b838e1eae669d5f0a071ce1"} Mar 19 00:33:13 crc kubenswrapper[4745]: I0319 00:33:13.979419 4745 generic.go:334] "Generic (PLEG): container finished" podID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerID="d9edc2ec4c7404182e0b8c14859b05a93541996296f20b1e6a8161d5fc3e98fc" exitCode=0 Mar 19 00:33:13 crc kubenswrapper[4745]: I0319 00:33:13.979479 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"d9edc2ec4c7404182e0b8c14859b05a93541996296f20b1e6a8161d5fc3e98fc"} Mar 19 00:33:14 crc kubenswrapper[4745]: I0319 00:33:14.986832 4745 generic.go:334] "Generic (PLEG): container finished" podID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerID="e53bb767b00c8d5bf2fa027d616f337147f469239d5f1158a1e3956e946eeab2" exitCode=0 Mar 19 00:33:14 crc kubenswrapper[4745]: I0319 00:33:14.986937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" 
event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"e53bb767b00c8d5bf2fa027d616f337147f469239d5f1158a1e3956e946eeab2"} Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.041764 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_047a89c0-d73e-49d9-bb4e-b01fcefe54a6/manage-dockerfile/0.log" Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.606229 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.606748 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.998934 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerStarted","Data":"835795f6ed7df0d4af5545ce929b9f07a201501b5f1a75e3ad659c65d69b3147"} Mar 19 00:33:16 crc kubenswrapper[4745]: I0319 00:33:16.028466 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.028444837 podStartE2EDuration="5.028444837s" podCreationTimestamp="2026-03-19 00:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:33:16.027149306 +0000 UTC m=+1560.565344437" 
watchObservedRunningTime="2026-03-19 00:33:16.028444837 +0000 UTC m=+1560.566639958" Mar 19 00:33:44 crc kubenswrapper[4745]: I0319 00:33:44.203566 4745 generic.go:334] "Generic (PLEG): container finished" podID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerID="835795f6ed7df0d4af5545ce929b9f07a201501b5f1a75e3ad659c65d69b3147" exitCode=0 Mar 19 00:33:44 crc kubenswrapper[4745]: I0319 00:33:44.203676 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"835795f6ed7df0d4af5545ce929b9f07a201501b5f1a75e3ad659c65d69b3147"} Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.557507 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.606167 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.606248 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.606315 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.607136 4745 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.607208 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" gracePeriod=600 Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650231 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650321 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650390 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650445 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650529 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650458 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651186 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651344 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651390 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 
crc kubenswrapper[4745]: I0319 00:33:45.651441 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651491 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651544 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651608 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652260 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652288 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 
19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652306 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652616 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652736 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.653182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.656160 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.657456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.657518 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv" (OuterVolumeSpecName: "kube-api-access-frfhv") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "kube-api-access-frfhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.658023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.660042 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). 
InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: E0319 00:33:45.739380 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753401 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753437 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753447 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753460 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753470 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753479 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753490 4745 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753502 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.921403 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.956617 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.223757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"56d6c6c471c42b322ae3628ab1707980810345401b838e1eae669d5f0a071ce1"} Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.223829 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d6c6c471c42b322ae3628ab1707980810345401b838e1eae669d5f0a071ce1" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.223776 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226031 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" exitCode=0 Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226079 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"} Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226127 4745 scope.go:117] "RemoveContainer" containerID="596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226474 4745 scope.go:117] "RemoveContainer" 
containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:33:46 crc kubenswrapper[4745]: E0319 00:33:46.226732 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.602011 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.671041 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.114663 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:48 crc kubenswrapper[4745]: E0319 00:33:48.115068 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="manage-dockerfile" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115086 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="manage-dockerfile" Mar 19 00:33:48 crc kubenswrapper[4745]: E0319 00:33:48.115104 4745 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="docker-build" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115114 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="docker-build" Mar 19 00:33:48 crc kubenswrapper[4745]: E0319 00:33:48.115135 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="git-clone" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115146 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="git-clone" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115289 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="docker-build" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.116018 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.119084 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-f8hd8" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.124850 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.195085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"infrawatch-operators-thzc6\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.296322 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"infrawatch-operators-thzc6\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.317998 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"infrawatch-operators-thzc6\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.449071 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.679406 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.686496 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:33:49 crc kubenswrapper[4745]: I0319 00:33:49.255948 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-thzc6" event={"ID":"d546c548-8ccd-4a8f-b790-7ba7e7340939","Type":"ContainerStarted","Data":"20f2ca013f9548cfbe6b25313aa9e1d6ca51c45681defe7c66c725315be6f45b"} Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.697504 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.915106 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-mrqcc"] Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.916691 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.925265 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-mrqcc"] Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.039850 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwjc\" (UniqueName: \"kubernetes.io/projected/9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b-kube-api-access-nxwjc\") pod \"infrawatch-operators-mrqcc\" (UID: \"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b\") " pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.142187 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwjc\" (UniqueName: \"kubernetes.io/projected/9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b-kube-api-access-nxwjc\") pod \"infrawatch-operators-mrqcc\" (UID: \"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b\") " pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.168930 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwjc\" (UniqueName: \"kubernetes.io/projected/9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b-kube-api-access-nxwjc\") pod \"infrawatch-operators-mrqcc\" (UID: \"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b\") " pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.241312 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.670501 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-mrqcc"] Mar 19 00:33:54 crc kubenswrapper[4745]: I0319 00:33:54.300022 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-mrqcc" event={"ID":"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b","Type":"ContainerStarted","Data":"cb67907a7b34baccdc293c4638d2b2130457235999263af3e6d8ae28aa5a13b2"} Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.147938 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.149426 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.150844 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.154424 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.154515 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.154694 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.181446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"auto-csr-approver-29564674-2zhcp\" (UID: 
\"c9fad81f-d73f-4e01-9a07-66b20741533e\") " pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.282590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"auto-csr-approver-29564674-2zhcp\" (UID: \"c9fad81f-d73f-4e01-9a07-66b20741533e\") " pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.305596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"auto-csr-approver-29564674-2zhcp\" (UID: \"c9fad81f-d73f-4e01-9a07-66b20741533e\") " pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.473229 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:01 crc kubenswrapper[4745]: I0319 00:34:01.138350 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:01 crc kubenswrapper[4745]: E0319 00:34:01.139620 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:04 crc kubenswrapper[4745]: E0319 00:34:04.034449 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Mar 19 00:34:04 crc kubenswrapper[4745]: E0319 00:34:04.035187 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hj2tv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-thzc6_service-telemetry(d546c548-8ccd-4a8f-b790-7ba7e7340939): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:34:04 crc kubenswrapper[4745]: E0319 00:34:04.036466 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/infrawatch-operators-thzc6" podUID="d546c548-8ccd-4a8f-b790-7ba7e7340939" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.383502 4745 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/infrawatch-operators-mrqcc" event={"ID":"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b","Type":"ContainerStarted","Data":"d8b30130338c71d17fef775727bdedeb621de7a2d31a5998975d589b9409aec7"} Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.394816 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.441029 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-mrqcc" podStartSLOduration=4.488774318 podStartE2EDuration="14.441001424s" podCreationTimestamp="2026-03-19 00:33:50 +0000 UTC" firstStartedPulling="2026-03-19 00:33:54.122388973 +0000 UTC m=+1598.660584104" lastFinishedPulling="2026-03-19 00:34:04.074616079 +0000 UTC m=+1608.612811210" observedRunningTime="2026-03-19 00:34:04.433635725 +0000 UTC m=+1608.971830876" watchObservedRunningTime="2026-03-19 00:34:04.441001424 +0000 UTC m=+1608.979196555" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.604713 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.649915 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"d546c548-8ccd-4a8f-b790-7ba7e7340939\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.656124 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv" (OuterVolumeSpecName: "kube-api-access-hj2tv") pod "d546c548-8ccd-4a8f-b790-7ba7e7340939" (UID: "d546c548-8ccd-4a8f-b790-7ba7e7340939"). InnerVolumeSpecName "kube-api-access-hj2tv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.751667 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.396397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" event={"ID":"c9fad81f-d73f-4e01-9a07-66b20741533e","Type":"ContainerStarted","Data":"77c6e8fd219209e3c00449d14e53d4b561cf95d96fc04068d5c4a90d228c1965"} Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.399334 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-thzc6" event={"ID":"d546c548-8ccd-4a8f-b790-7ba7e7340939","Type":"ContainerDied","Data":"20f2ca013f9548cfbe6b25313aa9e1d6ca51c45681defe7c66c725315be6f45b"} Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.399367 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.482120 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.501348 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:34:06 crc kubenswrapper[4745]: I0319 00:34:06.149632 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d546c548-8ccd-4a8f-b790-7ba7e7340939" path="/var/lib/kubelet/pods/d546c548-8ccd-4a8f-b790-7ba7e7340939/volumes" Mar 19 00:34:06 crc kubenswrapper[4745]: I0319 00:34:06.409299 4745 generic.go:334] "Generic (PLEG): container finished" podID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerID="6316938ec69e0de5c06031d730ac5d55047c4061f5dea80ee35cf89954f73a68" exitCode=0 Mar 19 00:34:06 crc kubenswrapper[4745]: I0319 00:34:06.409367 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" event={"ID":"c9fad81f-d73f-4e01-9a07-66b20741533e","Type":"ContainerDied","Data":"6316938ec69e0de5c06031d730ac5d55047c4061f5dea80ee35cf89954f73a68"} Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.699391 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.798742 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"c9fad81f-d73f-4e01-9a07-66b20741533e\" (UID: \"c9fad81f-d73f-4e01-9a07-66b20741533e\") " Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.806220 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2" (OuterVolumeSpecName: "kube-api-access-nktw2") pod "c9fad81f-d73f-4e01-9a07-66b20741533e" (UID: "c9fad81f-d73f-4e01-9a07-66b20741533e"). InnerVolumeSpecName "kube-api-access-nktw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.900994 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.426932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" event={"ID":"c9fad81f-d73f-4e01-9a07-66b20741533e","Type":"ContainerDied","Data":"77c6e8fd219209e3c00449d14e53d4b561cf95d96fc04068d5c4a90d228c1965"} Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.427340 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77c6e8fd219209e3c00449d14e53d4b561cf95d96fc04068d5c4a90d228c1965" Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.427028 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.771314 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.775866 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:34:10 crc kubenswrapper[4745]: I0319 00:34:10.147323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" path="/var/lib/kubelet/pods/c9ad0116-35eb-40db-8d57-4501affdf59c/volumes" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.241933 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.243028 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.274739 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.475432 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:15 crc kubenswrapper[4745]: I0319 00:34:15.138264 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:15 crc kubenswrapper[4745]: E0319 00:34:15.138917 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.549922 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"] Mar 19 00:34:23 crc kubenswrapper[4745]: E0319 00:34:23.551045 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerName="oc" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.551067 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerName="oc" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.551195 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerName="oc" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.552259 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.564074 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"] Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.722240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.722742 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.722955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.824411 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.824471 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.824560 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: 
\"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.825165 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.825324 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.849549 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.869490 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.326708 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"] Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.370477 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"] Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.372239 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.389235 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"] Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.543919 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.544000 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.544061 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.545007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerStarted","Data":"5e3d52819cccf117ee2102d83ab5a41d16baa92ad9b71c6b5093655e59ea25aa"} Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.645405 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.645529 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.645741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod 
\"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.646623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.646725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.670046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.745405 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.972506 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"] Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.555939 4745 generic.go:334] "Generic (PLEG): container finished" podID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerID="db1da1c178895583cbbeb02f9a019334c12324d85bb84b0829090b6fec562ec1" exitCode=0 Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.556066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"db1da1c178895583cbbeb02f9a019334c12324d85bb84b0829090b6fec562ec1"} Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.556490 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerStarted","Data":"221db3a012c61b898efbb2032e252e9399d8603ae3c537e79d9ed30671636fc7"} Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.560451 4745 generic.go:334] "Generic (PLEG): container finished" podID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerID="7aa2fb188e8dbc02204f77cd7f122296d6d12e090d081261a29dbfdb2dfd8a37" exitCode=0 Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.560508 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"7aa2fb188e8dbc02204f77cd7f122296d6d12e090d081261a29dbfdb2dfd8a37"} Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.569350 4745 generic.go:334] "Generic 
(PLEG): container finished" podID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerID="18c9701b802a7d35818e5a8247b67c03d39b001d0d53ccd381820ef55a41674a" exitCode=0 Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.569726 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"18c9701b802a7d35818e5a8247b67c03d39b001d0d53ccd381820ef55a41674a"} Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.571663 4745 generic.go:334] "Generic (PLEG): container finished" podID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerID="5846b2d515174b602b71e1177491af56a1f227f0410de497e5d9d1922b6a45fa" exitCode=0 Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.571711 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"5846b2d515174b602b71e1177491af56a1f227f0410de497e5d9d1922b6a45fa"} Mar 19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.582563 4745 generic.go:334] "Generic (PLEG): container finished" podID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerID="a99982be9fea52e3d27bfed2ebd1ecfe164ace8fa077144182d10a2dd4a2c8d3" exitCode=0 Mar 19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.582683 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"a99982be9fea52e3d27bfed2ebd1ecfe164ace8fa077144182d10a2dd4a2c8d3"} Mar 19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.585618 4745 generic.go:334] "Generic (PLEG): container finished" podID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerID="63096d30197409b38b02c27f3376d5ad2b7f979623cb77957e05fe00ffc5827a" exitCode=0 Mar 
19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.585657 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"63096d30197409b38b02c27f3376d5ad2b7f979623cb77957e05fe00ffc5827a"} Mar 19 00:34:28 crc kubenswrapper[4745]: I0319 00:34:28.138753 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:28 crc kubenswrapper[4745]: E0319 00:34:28.139434 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:28 crc kubenswrapper[4745]: I0319 00:34:28.880317 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:28 crc kubenswrapper[4745]: I0319 00:34:28.885354 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010228 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod \"34f163ba-1cf2-47e5-847f-8db4eac30c29\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010657 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"61a70d11-ba59-4ad2-8427-c28882835ad6\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"61a70d11-ba59-4ad2-8427-c28882835ad6\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010739 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"34f163ba-1cf2-47e5-847f-8db4eac30c29\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"61a70d11-ba59-4ad2-8427-c28882835ad6\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010875 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25dr4\" 
(UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"34f163ba-1cf2-47e5-847f-8db4eac30c29\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.011798 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle" (OuterVolumeSpecName: "bundle") pod "61a70d11-ba59-4ad2-8427-c28882835ad6" (UID: "61a70d11-ba59-4ad2-8427-c28882835ad6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.011814 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle" (OuterVolumeSpecName: "bundle") pod "34f163ba-1cf2-47e5-847f-8db4eac30c29" (UID: "34f163ba-1cf2-47e5-847f-8db4eac30c29"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.012357 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.012383 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.021130 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4" (OuterVolumeSpecName: "kube-api-access-25dr4") pod "34f163ba-1cf2-47e5-847f-8db4eac30c29" (UID: "34f163ba-1cf2-47e5-847f-8db4eac30c29"). InnerVolumeSpecName "kube-api-access-25dr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.021189 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g" (OuterVolumeSpecName: "kube-api-access-zqr4g") pod "61a70d11-ba59-4ad2-8427-c28882835ad6" (UID: "61a70d11-ba59-4ad2-8427-c28882835ad6"). InnerVolumeSpecName "kube-api-access-zqr4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.026489 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util" (OuterVolumeSpecName: "util") pod "34f163ba-1cf2-47e5-847f-8db4eac30c29" (UID: "34f163ba-1cf2-47e5-847f-8db4eac30c29"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.026550 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util" (OuterVolumeSpecName: "util") pod "61a70d11-ba59-4ad2-8427-c28882835ad6" (UID: "61a70d11-ba59-4ad2-8427-c28882835ad6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113698 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113733 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113742 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113753 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.603839 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"5e3d52819cccf117ee2102d83ab5a41d16baa92ad9b71c6b5093655e59ea25aa"} Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.604282 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e3d52819cccf117ee2102d83ab5a41d16baa92ad9b71c6b5093655e59ea25aa" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.603899 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.605994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"221db3a012c61b898efbb2032e252e9399d8603ae3c537e79d9ed30671636fc7"} Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.606038 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="221db3a012c61b898efbb2032e252e9399d8603ae3c537e79d9ed30671636fc7" Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.606124 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.046277 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"] Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047009 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="pull" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047024 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="pull" Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047033 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="util" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047039 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="util" Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047058 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="pull" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047064 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="pull" Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047073 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="extract" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047079 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="extract" Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047090 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="extract" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047096 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="extract" Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047105 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="util" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047110 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="util" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047265 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="extract" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047280 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="extract" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047898 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.063640 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"] Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.067409 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-x5fd7" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.189171 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/a8895e98-6a3f-4f8a-b671-9a7920ceb390-runner\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.189814 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68sr2\" (UniqueName: \"kubernetes.io/projected/a8895e98-6a3f-4f8a-b671-9a7920ceb390-kube-api-access-68sr2\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.291247 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/a8895e98-6a3f-4f8a-b671-9a7920ceb390-runner\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.291312 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68sr2\" (UniqueName: 
\"kubernetes.io/projected/a8895e98-6a3f-4f8a-b671-9a7920ceb390-kube-api-access-68sr2\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.293436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/a8895e98-6a3f-4f8a-b671-9a7920ceb390-runner\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.324046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68sr2\" (UniqueName: \"kubernetes.io/projected/a8895e98-6a3f-4f8a-b671-9a7920ceb390-kube-api-access-68sr2\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.367676 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.618237 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"] Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.658420 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" event={"ID":"a8895e98-6a3f-4f8a-b671-9a7920ceb390","Type":"ContainerStarted","Data":"fae757808023fb173f247138a7bb6d526986f626e1779762d3339598075a7ca4"} Mar 19 00:34:35 crc kubenswrapper[4745]: I0319 00:34:35.561782 4745 scope.go:117] "RemoveContainer" containerID="08e036dc6c9a44bd6fdc8a12f3525fb5e0bf5c4fdd30613e6e3e3b5a2939ce17" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.186068 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"] Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.187690 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.195726 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-zljj6" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.226306 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"] Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.347924 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/4754eb2f-bab5-413c-ab43-3b9142082c2f-runner\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.348027 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gqs\" (UniqueName: \"kubernetes.io/projected/4754eb2f-bab5-413c-ab43-3b9142082c2f-kube-api-access-t8gqs\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.449110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gqs\" (UniqueName: \"kubernetes.io/projected/4754eb2f-bab5-413c-ab43-3b9142082c2f-kube-api-access-t8gqs\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.449209 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/4754eb2f-bab5-413c-ab43-3b9142082c2f-runner\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.449721 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/4754eb2f-bab5-413c-ab43-3b9142082c2f-runner\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.481682 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8gqs\" (UniqueName: \"kubernetes.io/projected/4754eb2f-bab5-413c-ab43-3b9142082c2f-kube-api-access-t8gqs\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.531544 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.855485 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"] Mar 19 00:34:38 crc kubenswrapper[4745]: I0319 00:34:38.759342 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" event={"ID":"4754eb2f-bab5-413c-ab43-3b9142082c2f","Type":"ContainerStarted","Data":"153b914866c339974932d1e6651395a6fad755df186120bfd49a7ff0cb7ff251"} Mar 19 00:34:43 crc kubenswrapper[4745]: I0319 00:34:43.138372 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:43 crc kubenswrapper[4745]: E0319 00:34:43.139566 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.180772 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.181692 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773880325,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68sr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],
Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-ff5d8cc8d-tcxb4_service-telemetry(a8895e98-6a3f-4f8a-b671-9a7920ceb390): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.183273 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" podUID="a8895e98-6a3f-4f8a-b671-9a7920ceb390" Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.885005 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" podUID="a8895e98-6a3f-4f8a-b671-9a7920ceb390" Mar 19 00:34:55 crc kubenswrapper[4745]: I0319 00:34:55.138397 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:55 crc kubenswrapper[4745]: E0319 00:34:55.139092 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:57 crc kubenswrapper[4745]: I0319 00:34:57.934914 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" event={"ID":"4754eb2f-bab5-413c-ab43-3b9142082c2f","Type":"ContainerStarted","Data":"fd141a4b6978aac22d9aaa325a3052b0e666649d82df326a720549e63a18ef11"} Mar 19 00:34:57 crc kubenswrapper[4745]: I0319 00:34:57.956144 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" podStartSLOduration=2.041218155 podStartE2EDuration="20.956120722s" podCreationTimestamp="2026-03-19 00:34:37 +0000 UTC" firstStartedPulling="2026-03-19 00:34:37.886828181 +0000 UTC m=+1642.425023312" lastFinishedPulling="2026-03-19 00:34:56.801730748 +0000 UTC m=+1661.339925879" observedRunningTime="2026-03-19 00:34:57.954212752 +0000 UTC m=+1662.492407883" watchObservedRunningTime="2026-03-19 00:34:57.956120722 +0000 UTC m=+1662.494315843" Mar 19 00:35:07 crc kubenswrapper[4745]: I0319 00:35:07.137916 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:35:07 crc kubenswrapper[4745]: E0319 00:35:07.138763 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:35:09 crc kubenswrapper[4745]: I0319 00:35:09.011413 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" 
event={"ID":"a8895e98-6a3f-4f8a-b671-9a7920ceb390","Type":"ContainerStarted","Data":"84bedd6b6132ec1bae853ff3bc2524b11e0cf5fe5139bbbdb11993796612bce7"} Mar 19 00:35:09 crc kubenswrapper[4745]: I0319 00:35:09.035744 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" podStartSLOduration=1.063646116 podStartE2EDuration="35.035706428s" podCreationTimestamp="2026-03-19 00:34:34 +0000 UTC" firstStartedPulling="2026-03-19 00:34:34.627257528 +0000 UTC m=+1639.165452659" lastFinishedPulling="2026-03-19 00:35:08.59931784 +0000 UTC m=+1673.137512971" observedRunningTime="2026-03-19 00:35:09.031873797 +0000 UTC m=+1673.570068938" watchObservedRunningTime="2026-03-19 00:35:09.035706428 +0000 UTC m=+1673.573901559" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.331653 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.333209 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.336435 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.336562 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.336782 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.337073 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.337282 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.339193 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-95txw" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.339539 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.357836 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374007 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: 
\"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374333 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374433 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374504 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374589 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: 
I0319 00:35:21.374691 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374765 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477652 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: 
\"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477830 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477952 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.478041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.478063 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc 
kubenswrapper[4745]: I0319 00:35:21.480342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.487599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.487763 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.488157 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.488162 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod 
\"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.488266 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.498488 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.650078 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:23 crc kubenswrapper[4745]: I0319 00:35:22.137797 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:35:23 crc kubenswrapper[4745]: E0319 00:35:22.138468 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:35:23 crc kubenswrapper[4745]: I0319 00:35:23.866861 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:35:24 crc kubenswrapper[4745]: I0319 00:35:24.148351 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerStarted","Data":"dcfd75548cf8b7e9217b0b25a0988010bbd42dcd7ccfe6fcc357469ebb9d89b8"} Mar 19 00:35:29 crc kubenswrapper[4745]: I0319 00:35:29.192586 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerStarted","Data":"6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36"} Mar 19 00:35:29 crc kubenswrapper[4745]: I0319 00:35:29.213281 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" podStartSLOduration=3.535487158 podStartE2EDuration="8.213251386s" podCreationTimestamp="2026-03-19 00:35:21 +0000 UTC" firstStartedPulling="2026-03-19 00:35:23.876833018 +0000 UTC 
m=+1688.415028189" lastFinishedPulling="2026-03-19 00:35:28.554597286 +0000 UTC m=+1693.092792417" observedRunningTime="2026-03-19 00:35:29.209849489 +0000 UTC m=+1693.748044630" watchObservedRunningTime="2026-03-19 00:35:29.213251386 +0000 UTC m=+1693.751446517" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.747539 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.749536 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.751753 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752139 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-57c55" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752408 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752489 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752412 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.753617 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.753785 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.753981 4745 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.754295 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.759497 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.769486 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961080 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-tls-assets\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961164 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961193 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961217 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961239 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961266 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-web-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961333 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961353 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961471 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-958zv\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-kube-api-access-958zv\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961534 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961658 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bba4496-9224-467b-80ca-ff25c39604ec-config-out\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961708 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063482 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-web-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 
00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063622 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063676 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-958zv\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-kube-api-access-958zv\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063711 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063761 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bba4496-9224-467b-80ca-ff25c39604ec-config-out\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " 
pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063815 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063871 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-tls-assets\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063932 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.064169 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.064196 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: 
\"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.064224 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.064738 4745 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.064852 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls podName:7bba4496-9224-467b-80ca-ff25c39604ec nodeName:}" failed. No retries permitted until 2026-03-19 00:35:33.564822974 +0000 UTC m=+1698.103018105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "7bba4496-9224-467b-80ca-ff25c39604ec") : secret "default-prometheus-proxy-tls" not found Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.065472 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.065504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.065473 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.066218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 
00:35:33.071020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.071508 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-web-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.072502 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-tls-assets\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.073277 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bba4496-9224-467b-80ca-ff25c39604ec-config-out\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.073715 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.073758 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e72b50503c51e0d02fdb78c6acc0817ced455338efa3523ed94a9d283c40373/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.083254 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.085666 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-958zv\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-kube-api-access-958zv\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.098779 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.572761 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.573004 4745 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.573069 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls podName:7bba4496-9224-467b-80ca-ff25c39604ec nodeName:}" failed. No retries permitted until 2026-03-19 00:35:34.573049659 +0000 UTC m=+1699.111244790 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "7bba4496-9224-467b-80ca-ff25c39604ec") : secret "default-prometheus-proxy-tls" not found Mar 19 00:35:34 crc kubenswrapper[4745]: I0319 00:35:34.589877 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:34 crc kubenswrapper[4745]: I0319 00:35:34.617504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:34 crc 
kubenswrapper[4745]: I0319 00:35:34.870263 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 19 00:35:35 crc kubenswrapper[4745]: I0319 00:35:35.361163 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 19 00:35:36 crc kubenswrapper[4745]: I0319 00:35:36.265705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"cf8cf1057b9dce4ad67641c5f9d14b2568c1ed26325186a786b78caa6d428710"} Mar 19 00:35:37 crc kubenswrapper[4745]: I0319 00:35:37.137232 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:35:37 crc kubenswrapper[4745]: E0319 00:35:37.137940 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:35:40 crc kubenswrapper[4745]: I0319 00:35:40.300967 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"c82a19278aef976ca00a2ceac0c95fb7154d152b86cb049420592bb81d699026"} Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.791896 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"] Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.793592 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.806186 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"] Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.935739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ssjs\" (UniqueName: \"kubernetes.io/projected/7ed8bcc5-1389-4cff-b64c-bf3b813f642e-kube-api-access-8ssjs\") pod \"default-snmp-webhook-6856cfb745-4pxj4\" (UID: \"7ed8bcc5-1389-4cff-b64c-bf3b813f642e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.037056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ssjs\" (UniqueName: \"kubernetes.io/projected/7ed8bcc5-1389-4cff-b64c-bf3b813f642e-kube-api-access-8ssjs\") pod \"default-snmp-webhook-6856cfb745-4pxj4\" (UID: \"7ed8bcc5-1389-4cff-b64c-bf3b813f642e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.058174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ssjs\" (UniqueName: \"kubernetes.io/projected/7ed8bcc5-1389-4cff-b64c-bf3b813f642e-kube-api-access-8ssjs\") pod \"default-snmp-webhook-6856cfb745-4pxj4\" (UID: \"7ed8bcc5-1389-4cff-b64c-bf3b813f642e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.114167 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.344281 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"] Mar 19 00:35:45 crc kubenswrapper[4745]: I0319 00:35:45.348431 4745 generic.go:334] "Generic (PLEG): container finished" podID="7bba4496-9224-467b-80ca-ff25c39604ec" containerID="c82a19278aef976ca00a2ceac0c95fb7154d152b86cb049420592bb81d699026" exitCode=0 Mar 19 00:35:45 crc kubenswrapper[4745]: I0319 00:35:45.348601 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerDied","Data":"c82a19278aef976ca00a2ceac0c95fb7154d152b86cb049420592bb81d699026"} Mar 19 00:35:45 crc kubenswrapper[4745]: I0319 00:35:45.351679 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" event={"ID":"7ed8bcc5-1389-4cff-b64c-bf3b813f642e","Type":"ContainerStarted","Data":"e78addcb3241e534311ac9cce6865e47d325db27204bf2125812d6ca0b948193"} Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.230575 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.235277 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.239717 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-2vrkh" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240269 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240519 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240685 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240835 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.241770 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.245559 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399671 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399723 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399766 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-volume\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399823 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-web-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399848 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399866 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.400016 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-tls-assets\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.400059 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-out\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.400100 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmkd\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-kube-api-access-smmkd\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.501859 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.501967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.501998 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-volume\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: E0319 00:35:47.502086 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:47 crc kubenswrapper[4745]: E0319 00:35:47.503067 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:48.003030383 +0000 UTC m=+1712.541225514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503147 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503247 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-web-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " 
pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503342 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503402 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-tls-assets\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503525 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-out\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503922 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmkd\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-kube-api-access-smmkd\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.512408 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.512591 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f44b85b436fe394314838612ea3934316cfd49736989031f5eaeb5e07b616f52/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.513459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-tls-assets\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.513137 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-out\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.514004 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.514399 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-web-config\") pod 
\"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.534170 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmkd\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-kube-api-access-smmkd\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.539208 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-volume\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.562638 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.567938 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:48 crc kubenswrapper[4745]: I0319 00:35:48.017558 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod 
\"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:48 crc kubenswrapper[4745]: E0319 00:35:48.017772 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:48 crc kubenswrapper[4745]: E0319 00:35:48.017833 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:49.017817293 +0000 UTC m=+1713.556012424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:48 crc kubenswrapper[4745]: I0319 00:35:48.138728 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:35:48 crc kubenswrapper[4745]: E0319 00:35:48.139176 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:35:49 crc kubenswrapper[4745]: I0319 00:35:49.032206 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:49 crc kubenswrapper[4745]: E0319 00:35:49.032420 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:49 crc kubenswrapper[4745]: E0319 00:35:49.032749 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:51.032724424 +0000 UTC m=+1715.570919555 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:51 crc kubenswrapper[4745]: I0319 00:35:51.067808 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:51 crc kubenswrapper[4745]: E0319 00:35:51.068097 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:51 crc kubenswrapper[4745]: E0319 00:35:51.068390 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:55.068366689 +0000 UTC m=+1719.606561820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found Mar 19 00:35:55 crc kubenswrapper[4745]: I0319 00:35:55.135653 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:55 crc kubenswrapper[4745]: I0319 00:35:55.155827 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:55 crc kubenswrapper[4745]: I0319 00:35:55.370118 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.198612 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 19 00:35:56 crc kubenswrapper[4745]: W0319 00:35:56.202581 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a13ce6d_9eb6_4d56_adfd_7c51a426c4b7.slice/crio-b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232 WatchSource:0}: Error finding container b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232: Status 404 returned error can't find the container with id b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232 Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.452068 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"3ca731c08f48fe75ca37730ddc55af586eaf24640cfc01e5251547d8fe2dcf32"} Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.453788 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" event={"ID":"7ed8bcc5-1389-4cff-b64c-bf3b813f642e","Type":"ContainerStarted","Data":"87f458034926bc2a409c50354a3eaeb72e25a722731956d92cfb42a63e9f3640"} Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.456319 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232"} Mar 19 00:35:59 crc kubenswrapper[4745]: I0319 00:35:59.479542 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"0491f36c0bfcc256f95f552285dd5d8bf9df1e78fe12f7e0dc4b452a81746238"} Mar 19 00:35:59 crc kubenswrapper[4745]: I0319 00:35:59.482212 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"11c715d2f9fac89bb8a3a7830b902ae3132a5417eb1f5a0f573327daa0282902"} Mar 19 00:35:59 crc kubenswrapper[4745]: I0319 00:35:59.509833 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" podStartSLOduration=5.385977137 podStartE2EDuration="16.509788367s" podCreationTimestamp="2026-03-19 00:35:43 +0000 UTC" firstStartedPulling="2026-03-19 00:35:44.354547593 +0000 UTC m=+1708.892742724" lastFinishedPulling="2026-03-19 00:35:55.478358823 +0000 UTC m=+1720.016553954" observedRunningTime="2026-03-19 00:35:56.468727416 +0000 UTC m=+1721.006922547" watchObservedRunningTime="2026-03-19 00:35:59.509788367 +0000 UTC m=+1724.047983508" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.140060 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:36:00 crc kubenswrapper[4745]: E0319 00:36:00.140836 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.171337 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:36:00 crc kubenswrapper[4745]: 
I0319 00:36:00.172124 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.172225 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.175338 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.175567 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.175735 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.217968 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"auto-csr-approver-29564676-6jsmm\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.322302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"auto-csr-approver-29564676-6jsmm\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.361760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod 
\"auto-csr-approver-29564676-6jsmm\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.500192 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.997205 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:36:01 crc kubenswrapper[4745]: I0319 00:36:01.508278 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" event={"ID":"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60","Type":"ContainerStarted","Data":"70fe04024e13a78c92725ba2469acbf2db19f7926717f6ee559cb02f5b6394d3"} Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.848407 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"] Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.854040 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.862138 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.862189 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.862464 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-7zvhj" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.863030 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.878711 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"] Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972568 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71276875-43be-4d09-a25d-4327369c3a53-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972651 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqj8\" (UniqueName: \"kubernetes.io/projected/71276875-43be-4d09-a25d-4327369c3a53-kube-api-access-5qqj8\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" 
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972795 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972841 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972927 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71276875-43be-4d09-a25d-4327369c3a53-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074468 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71276875-43be-4d09-a25d-4327369c3a53-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074568 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5qqj8\" (UniqueName: \"kubernetes.io/projected/71276875-43be-4d09-a25d-4327369c3a53-kube-api-access-5qqj8\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074820 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71276875-43be-4d09-a25d-4327369c3a53-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.075090 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/71276875-43be-4d09-a25d-4327369c3a53-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.075331 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.075460 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls podName:71276875-43be-4d09-a25d-4327369c3a53 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:03.575429792 +0000 UTC m=+1728.113625103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" (UID: "71276875-43be-4d09-a25d-4327369c3a53") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.076635 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71276875-43be-4d09-a25d-4327369c3a53-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.091238 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: 
\"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.097777 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqj8\" (UniqueName: \"kubernetes.io/projected/71276875-43be-4d09-a25d-4327369c3a53-kube-api-access-5qqj8\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.590742 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.591024 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.591385 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls podName:71276875-43be-4d09-a25d-4327369c3a53 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:04.591356898 +0000 UTC m=+1729.129552029 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" (UID: "71276875-43be-4d09-a25d-4327369c3a53") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.531833 4745 generic.go:334] "Generic (PLEG): container finished" podID="0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7" containerID="11c715d2f9fac89bb8a3a7830b902ae3132a5417eb1f5a0f573327daa0282902" exitCode=0 Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.531903 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerDied","Data":"11c715d2f9fac89bb8a3a7830b902ae3132a5417eb1f5a0f573327daa0282902"} Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.610252 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.615755 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.685245 4745 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.328276 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6"] Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.336019 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.339391 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.339609 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.343177 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6"] Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439548 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439626 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: 
\"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439653 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbqf\" (UniqueName: \"kubernetes.io/projected/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-kube-api-access-rzbqf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439693 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439713 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541247 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 
00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541767 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541805 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbqf\" (UniqueName: \"kubernetes.io/projected/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-kube-api-access-rzbqf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541869 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541917 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: E0319 00:36:06.542071 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: 
secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:06 crc kubenswrapper[4745]: E0319 00:36:06.542209 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls podName:aa12986d-ffa2-4a08-9069-77fc4fdd80c6 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:07.042162886 +0000 UTC m=+1731.580358187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" (UID: "aa12986d-ffa2-4a08-9069-77fc4fdd80c6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.542271 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.543459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.558083 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" 
(UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.565915 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbqf\" (UniqueName: \"kubernetes.io/projected/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-kube-api-access-rzbqf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.597442 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"14115b4c7fc81ef3247bc1ed47b86b4e33a0eb9ca5358e50ae275dbb80cdfd00"} Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.629775 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"] Mar 19 00:36:06 crc kubenswrapper[4745]: W0319 00:36:06.991252 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71276875_43be_4d09_a25d_4327369c3a53.slice/crio-e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a WatchSource:0}: Error finding container e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a: Status 404 returned error can't find the container with id e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.049302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:07 crc kubenswrapper[4745]: E0319 00:36:07.049536 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:07 crc kubenswrapper[4745]: E0319 00:36:07.049648 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls podName:aa12986d-ffa2-4a08-9069-77fc4fdd80c6 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:08.049619057 +0000 UTC m=+1732.587814188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" (UID: "aa12986d-ffa2-4a08-9069-77fc4fdd80c6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.607051 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a"} Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.609392 4745 generic.go:334] "Generic (PLEG): container finished" podID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerID="4de285e74e641eb21f2a1ee98f32c5f610d3c8d1a0fc10bb8a444c82e684e43e" exitCode=0 Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.609497 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" 
event={"ID":"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60","Type":"ContainerDied","Data":"4de285e74e641eb21f2a1ee98f32c5f610d3c8d1a0fc10bb8a444c82e684e43e"} Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.625448 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.775374703 podStartE2EDuration="36.625423395s" podCreationTimestamp="2026-03-19 00:35:31 +0000 UTC" firstStartedPulling="2026-03-19 00:35:35.350558811 +0000 UTC m=+1699.888753942" lastFinishedPulling="2026-03-19 00:36:06.200607503 +0000 UTC m=+1730.738802634" observedRunningTime="2026-03-19 00:36:06.641248304 +0000 UTC m=+1731.179443435" watchObservedRunningTime="2026-03-19 00:36:07.625423395 +0000 UTC m=+1732.163618526" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.074252 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.081596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.163641 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.611815 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6"] Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.621988 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"3cd590c42af1c4dd91a1a45086b1a788efa5679fd559921573aa95fc57e49918"} Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.623926 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"5fb4a33900be022415bf042c720b7ab321b0b8282f6497c694b151ca05ad4fd6"} Mar 19 00:36:08 crc kubenswrapper[4745]: W0319 00:36:08.646646 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa12986d_ffa2_4a08_9069_77fc4fdd80c6.slice/crio-eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8 WatchSource:0}: Error finding container eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8: Status 404 returned error can't find the container with id eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8 Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.863281 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.897548 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.904591 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2" (OuterVolumeSpecName: "kube-api-access-ctsf2") pod "b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" (UID: "b28c36f9-bd14-4de4-a0b3-e3f5e9131f60"). InnerVolumeSpecName "kube-api-access-ctsf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:08.999597 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.632647 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.632652 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" event={"ID":"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60","Type":"ContainerDied","Data":"70fe04024e13a78c92725ba2469acbf2db19f7926717f6ee559cb02f5b6394d3"} Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.632819 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70fe04024e13a78c92725ba2469acbf2db19f7926717f6ee559cb02f5b6394d3" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.633824 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8"} Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.870935 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.941039 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.947715 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.145914 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" path="/var/lib/kubelet/pods/9a566f97-13b9-4fde-868a-f55bd82a1af6/volumes" Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.645964 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" 
event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7"} Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.655833 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67"} Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.655932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"8ad5b98b5dda413ed20a29983acc0b2407844b57318959de371275cfcb8944ac"} Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.659135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"93d8d19ccb8cad93714b317ffb236f4f51edc43950a55d0e225f5ed61d3f97a7"} Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.274758 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"] Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.275211 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerName="oc" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.275238 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerName="oc" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.275406 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerName="oc" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.276741 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.280312 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.280611 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.288688 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"] Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337170 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/651a6724-09fd-4395-859f-7fdff0781163-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337264 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czp9r\" (UniqueName: \"kubernetes.io/projected/651a6724-09fd-4395-859f-7fdff0781163-kube-api-access-czp9r\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337317 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-session-secret\") pod 
\"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337376 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337408 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/651a6724-09fd-4395-859f-7fdff0781163-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.438920 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czp9r\" (UniqueName: \"kubernetes.io/projected/651a6724-09fd-4395-859f-7fdff0781163-kube-api-access-czp9r\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/651a6724-09fd-4395-859f-7fdff0781163-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439124 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/651a6724-09fd-4395-859f-7fdff0781163-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.439336 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.439464 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls podName:651a6724-09fd-4395-859f-7fdff0781163 nodeName:}" failed. 
No retries permitted until 2026-03-19 00:36:11.939432159 +0000 UTC m=+1736.477627290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" (UID: "651a6724-09fd-4395-859f-7fdff0781163") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439976 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/651a6724-09fd-4395-859f-7fdff0781163-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.440302 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/651a6724-09fd-4395-859f-7fdff0781163-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.462670 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czp9r\" (UniqueName: \"kubernetes.io/projected/651a6724-09fd-4395-859f-7fdff0781163-kube-api-access-czp9r\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.463280 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: 
\"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.674756 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"76c77b4195939436d825229401dfff783f924882281e352d295e60a3c2c38635"} Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.949669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.951471 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.951567 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls podName:651a6724-09fd-4395-859f-7fdff0781163 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:12.951520084 +0000 UTC m=+1737.489715215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" (UID: "651a6724-09fd-4395-859f-7fdff0781163") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 19 00:36:12 crc kubenswrapper[4745]: I0319 00:36:12.137955 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:36:12 crc kubenswrapper[4745]: E0319 00:36:12.138231 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:36:12 crc kubenswrapper[4745]: I0319 00:36:12.969707 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:12 crc kubenswrapper[4745]: I0319 00:36:12.977018 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 
19 00:36:13 crc kubenswrapper[4745]: I0319 00:36:13.120357 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.174427 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=22.614174917 podStartE2EDuration="29.17440589s" podCreationTimestamp="2026-03-19 00:35:46 +0000 UTC" firstStartedPulling="2026-03-19 00:36:04.534785354 +0000 UTC m=+1729.072980485" lastFinishedPulling="2026-03-19 00:36:11.095016327 +0000 UTC m=+1735.633211458" observedRunningTime="2026-03-19 00:36:11.707793082 +0000 UTC m=+1736.245988243" watchObservedRunningTime="2026-03-19 00:36:15.17440589 +0000 UTC m=+1739.712601021" Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.180436 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"] Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.718419 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"5c3373afb0e47ceff526faab0bd50eaca0309fc30b318776cfe25d7bfa420547"} Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.720461 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"e855459e41a66e7cd0597ff7c75b2951087cdc755d5e4b354ce12c2fde8728e6"} Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.722331 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" 
event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"00d724fb7267f4bd8b88cf89d7e346caef43a54f15860ea5c23aef5b049931a8"} Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.768496 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" podStartSLOduration=6.087772744 podStartE2EDuration="13.76846798s" podCreationTimestamp="2026-03-19 00:36:02 +0000 UTC" firstStartedPulling="2026-03-19 00:36:06.99669713 +0000 UTC m=+1731.534892261" lastFinishedPulling="2026-03-19 00:36:14.677392366 +0000 UTC m=+1739.215587497" observedRunningTime="2026-03-19 00:36:15.744740298 +0000 UTC m=+1740.282935419" watchObservedRunningTime="2026-03-19 00:36:15.76846798 +0000 UTC m=+1740.306663101" Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.769673 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" podStartSLOduration=3.6944758540000002 podStartE2EDuration="9.769663177s" podCreationTimestamp="2026-03-19 00:36:06 +0000 UTC" firstStartedPulling="2026-03-19 00:36:08.650108692 +0000 UTC m=+1733.188303823" lastFinishedPulling="2026-03-19 00:36:14.725296015 +0000 UTC m=+1739.263491146" observedRunningTime="2026-03-19 00:36:15.763958609 +0000 UTC m=+1740.302153760" watchObservedRunningTime="2026-03-19 00:36:15.769663177 +0000 UTC m=+1740.307858308" Mar 19 00:36:16 crc kubenswrapper[4745]: I0319 00:36:16.734323 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"b715fd563da2fd37e54c08026dd1465560a32dc988ef695b1f663c83ecd1a0ba"} Mar 19 00:36:16 crc kubenswrapper[4745]: I0319 00:36:16.734814 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b"} Mar 19 00:36:16 crc kubenswrapper[4745]: I0319 00:36:16.734834 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"69def65840cec58199c78c09533c14fbf3ae19b610fc97a7d7bfa870ad7cc13f"} Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.409595 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" podStartSLOduration=7.418241052 podStartE2EDuration="8.409567726s" podCreationTimestamp="2026-03-19 00:36:11 +0000 UTC" firstStartedPulling="2026-03-19 00:36:15.191226156 +0000 UTC m=+1739.729421287" lastFinishedPulling="2026-03-19 00:36:16.18255284 +0000 UTC m=+1740.720747961" observedRunningTime="2026-03-19 00:36:16.762256131 +0000 UTC m=+1741.300451262" watchObservedRunningTime="2026-03-19 00:36:19.409567726 +0000 UTC m=+1743.947762878" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.413345 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"] Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.415016 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.418185 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.418577 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.423231 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"] Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.574911 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d506c7b-246f-4aca-b3ac-635dbc53b579-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.575375 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1d506c7b-246f-4aca-b3ac-635dbc53b579-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.575628 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d506c7b-246f-4aca-b3ac-635dbc53b579-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.575702 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkvs\" (UniqueName: \"kubernetes.io/projected/1d506c7b-246f-4aca-b3ac-635dbc53b579-kube-api-access-xlkvs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.676866 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d506c7b-246f-4aca-b3ac-635dbc53b579-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677423 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1d506c7b-246f-4aca-b3ac-635dbc53b579-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677492 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d506c7b-246f-4aca-b3ac-635dbc53b579-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677525 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xlkvs\" (UniqueName: \"kubernetes.io/projected/1d506c7b-246f-4aca-b3ac-635dbc53b579-kube-api-access-xlkvs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677531 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d506c7b-246f-4aca-b3ac-635dbc53b579-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.678599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d506c7b-246f-4aca-b3ac-635dbc53b579-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.686560 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1d506c7b-246f-4aca-b3ac-635dbc53b579-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.700641 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlkvs\" (UniqueName: \"kubernetes.io/projected/1d506c7b-246f-4aca-b3ac-635dbc53b579-kube-api-access-xlkvs\") pod 
\"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.735398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.871311 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.927177 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.250762 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"] Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.766324 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"d1da666fbdebf65ceda1716f591828f9d729c6cab2fd5294e467d4cf6d41ddd9"} Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.812211 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.972622 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"] Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.975151 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.984006 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"] Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.988347 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114675 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f88023b4-4c23-4946-a12b-3f0cdab93771-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114764 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvmf\" (UniqueName: \"kubernetes.io/projected/f88023b4-4c23-4946-a12b-3f0cdab93771-kube-api-access-jbvmf\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114798 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f88023b4-4c23-4946-a12b-3f0cdab93771-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114844 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f88023b4-4c23-4946-a12b-3f0cdab93771-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.216824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f88023b4-4c23-4946-a12b-3f0cdab93771-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.216942 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvmf\" (UniqueName: \"kubernetes.io/projected/f88023b4-4c23-4946-a12b-3f0cdab93771-kube-api-access-jbvmf\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.216985 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f88023b4-4c23-4946-a12b-3f0cdab93771-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.217029 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/f88023b4-4c23-4946-a12b-3f0cdab93771-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.218790 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f88023b4-4c23-4946-a12b-3f0cdab93771-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.218988 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f88023b4-4c23-4946-a12b-3f0cdab93771-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.237720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f88023b4-4c23-4946-a12b-3f0cdab93771-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.241554 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvmf\" (UniqueName: \"kubernetes.io/projected/f88023b4-4c23-4946-a12b-3f0cdab93771-kube-api-access-jbvmf\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.322703 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" Mar 19 00:36:26 crc kubenswrapper[4745]: I0319 00:36:26.149133 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:36:26 crc kubenswrapper[4745]: E0319 00:36:26.150288 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:36:29 crc kubenswrapper[4745]: I0319 00:36:29.100296 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"] Mar 19 00:36:29 crc kubenswrapper[4745]: I0319 00:36:29.164800 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"cd1becfe8def766c68f51f1f70e281ff8b63a047134e521aeccb27fcb24da6af"} Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.174943 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"5db347cbdc2cb16a52036d40008b6863e1e50d39a28cedc5cbcc6eaf56324ba9"} Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.175786 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a"} Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.177710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"2af68900592a904c84b38f8c758a12ed12be59dab65307542ab1213e16f32357"} Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.177761 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2"} Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.202000 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" podStartSLOduration=1.7290795829999999 podStartE2EDuration="11.201971569s" podCreationTimestamp="2026-03-19 00:36:19 +0000 UTC" firstStartedPulling="2026-03-19 00:36:20.247247804 +0000 UTC m=+1744.785442935" lastFinishedPulling="2026-03-19 00:36:29.72013979 +0000 UTC m=+1754.258334921" observedRunningTime="2026-03-19 00:36:30.197712655 +0000 UTC m=+1754.735907796" watchObservedRunningTime="2026-03-19 00:36:30.201971569 +0000 UTC m=+1754.740166720" Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.217707 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" podStartSLOduration=9.396766983 podStartE2EDuration="10.21768191s" podCreationTimestamp="2026-03-19 00:36:20 +0000 UTC" firstStartedPulling="2026-03-19 00:36:29.106242906 +0000 UTC 
m=+1753.644438037" lastFinishedPulling="2026-03-19 00:36:29.927157833 +0000 UTC m=+1754.465352964" observedRunningTime="2026-03-19 00:36:30.214757799 +0000 UTC m=+1754.752952930" watchObservedRunningTime="2026-03-19 00:36:30.21768191 +0000 UTC m=+1754.755877041" Mar 19 00:36:35 crc kubenswrapper[4745]: I0319 00:36:35.693509 4745 scope.go:117] "RemoveContainer" containerID="e1e89ee6fc2c85074b56c8f19c7bf183b3c352108812ec9dcafde77f229e8ca5" Mar 19 00:36:35 crc kubenswrapper[4745]: I0319 00:36:35.742106 4745 scope.go:117] "RemoveContainer" containerID="ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b" Mar 19 00:36:35 crc kubenswrapper[4745]: I0319 00:36:35.776201 4745 scope.go:117] "RemoveContainer" containerID="649af08f705b85b72e2b308ff29da2e7d5edce2a304ca0c563098fa2b731a46b" Mar 19 00:36:37 crc kubenswrapper[4745]: I0319 00:36:37.137892 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:36:37 crc kubenswrapper[4745]: E0319 00:36:37.138664 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:36:37 crc kubenswrapper[4745]: I0319 00:36:37.777869 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:36:37 crc kubenswrapper[4745]: I0319 00:36:37.778242 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect" 
containerID="cri-o://6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36" gracePeriod=30 Mar 19 00:36:38 crc kubenswrapper[4745]: I0319 00:36:38.665211 4745 generic.go:334] "Generic (PLEG): container finished" podID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerID="6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36" exitCode=0 Mar 19 00:36:38 crc kubenswrapper[4745]: I0319 00:36:38.665716 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerDied","Data":"6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.009867 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.049068 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-wgvm9"] Mar 19 00:36:39 crc kubenswrapper[4745]: E0319 00:36:39.049403 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.049421 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.049539 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.050100 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.087097 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-wgvm9"] Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130312 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130530 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130666 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: 
\"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130727 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130779 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.131681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.137188 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "sasl-users". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.137548 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.138274 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62" (OuterVolumeSpecName: "kube-api-access-29b62") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "kube-api-access-29b62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.138729 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.139184 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-inter-router-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.140928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.232383 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.232861 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.232905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " 
pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233051 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bkh9\" (UniqueName: \"kubernetes.io/projected/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-kube-api-access-9bkh9\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233129 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-users\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233163 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233203 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-config\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233295 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29b62\" 
(UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233308 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233321 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233332 4745 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233342 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233351 4745 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233362 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334402 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bkh9\" (UniqueName: \"kubernetes.io/projected/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-kube-api-access-9bkh9\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334514 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-users\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334550 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334575 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-config\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334678 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-credentials\") pod 
\"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334731 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.335817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-config\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.339342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.341218 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.341604 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.347157 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-users\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.351760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.353980 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bkh9\" (UniqueName: \"kubernetes.io/projected/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-kube-api-access-9bkh9\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: 
\"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.374306 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.682394 4745 generic.go:334] "Generic (PLEG): container finished" podID="aa12986d-ffa2-4a08-9069-77fc4fdd80c6" containerID="c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.682550 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerDied","Data":"c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.683567 4745 scope.go:117] "RemoveContainer" containerID="c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.690415 4745 generic.go:334] "Generic (PLEG): container finished" podID="1d506c7b-246f-4aca-b3ac-635dbc53b579" containerID="cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.690500 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerDied","Data":"cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.691244 4745 scope.go:117] "RemoveContainer" containerID="cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.691820 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-wgvm9"] Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.700407 4745 generic.go:334] "Generic (PLEG): container finished" podID="f88023b4-4c23-4946-a12b-3f0cdab93771" containerID="46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.700572 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerDied","Data":"46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.701328 4745 scope.go:117] "RemoveContainer" containerID="46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.725702 4745 generic.go:334] "Generic (PLEG): container finished" podID="71276875-43be-4d09-a25d-4327369c3a53" containerID="d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.726972 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerDied","Data":"d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.729972 4745 scope.go:117] "RemoveContainer" containerID="d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.745792 4745 generic.go:334] "Generic (PLEG): container finished" podID="651a6724-09fd-4395-859f-7fdff0781163" containerID="76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.745927 4745 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerDied","Data":"76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.746592 4745 scope.go:117] "RemoveContainer" containerID="76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.750771 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerDied","Data":"dcfd75548cf8b7e9217b0b25a0988010bbd42dcd7ccfe6fcc357469ebb9d89b8"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.750817 4745 scope.go:117] "RemoveContainer" containerID="6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.750951 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.923629 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.929753 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.150558 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" path="/var/lib/kubelet/pods/3c93cb13-815f-48ec-a316-a889a6717f7c/volumes" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.762702 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.786192 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" event={"ID":"b58f3451-d46e-43bc-8d65-cb9abbc9de0d","Type":"ContainerStarted","Data":"f32e002f0611eab40775ff09d1f7ff65af0d64799eba82d8809c019e2824ffe2"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.786259 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" event={"ID":"b58f3451-d46e-43bc-8d65-cb9abbc9de0d","Type":"ContainerStarted","Data":"66841ffae63083890e097a1f39b37e2b0215ed6f7b659978ae650f60a5f5f83c"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.791516 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" 
event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.796972 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.807118 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.809104 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.834587 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" podStartSLOduration=3.834559741 podStartE2EDuration="3.834559741s" podCreationTimestamp="2026-03-19 00:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:36:40.82078679 +0000 UTC m=+1765.358981921" watchObservedRunningTime="2026-03-19 00:36:40.834559741 +0000 UTC m=+1765.372754872" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.926803 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.929085 
4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.932455 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.932775 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.938240 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.068116 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lltwb\" (UniqueName: \"kubernetes.io/projected/9bc30d38-f540-480c-9289-45dbe7a4401b-kube-api-access-lltwb\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.068212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/9bc30d38-f540-480c-9289-45dbe7a4401b-qdr-test-config\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.068370 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/9bc30d38-f540-480c-9289-45dbe7a4401b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.170457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lltwb\" (UniqueName: 
\"kubernetes.io/projected/9bc30d38-f540-480c-9289-45dbe7a4401b-kube-api-access-lltwb\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.170538 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/9bc30d38-f540-480c-9289-45dbe7a4401b-qdr-test-config\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.170609 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/9bc30d38-f540-480c-9289-45dbe7a4401b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.171622 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/9bc30d38-f540-480c-9289-45dbe7a4401b-qdr-test-config\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.179061 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/9bc30d38-f540-480c-9289-45dbe7a4401b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.190697 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lltwb\" (UniqueName: \"kubernetes.io/projected/9bc30d38-f540-480c-9289-45dbe7a4401b-kube-api-access-lltwb\") pod \"qdr-test\" (UID: 
\"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.438142 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 19 00:36:42 crc kubenswrapper[4745]: W0319 00:36:42.364804 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bc30d38_f540_480c_9289_45dbe7a4401b.slice/crio-baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862 WatchSource:0}: Error finding container baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862: Status 404 returned error can't find the container with id baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.372715 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.878647 4745 generic.go:334] "Generic (PLEG): container finished" podID="f88023b4-4c23-4946-a12b-3f0cdab93771" containerID="6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.878760 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerDied","Data":"6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.878827 4745 scope.go:117] "RemoveContainer" containerID="46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.880620 4745 scope.go:117] "RemoveContainer" containerID="6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.881040 4745 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw_service-telemetry(f88023b4-4c23-4946-a12b-3f0cdab93771)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" podUID="f88023b4-4c23-4946-a12b-3f0cdab93771" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.892943 4745 generic.go:334] "Generic (PLEG): container finished" podID="71276875-43be-4d09-a25d-4327369c3a53" containerID="12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.893013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerDied","Data":"12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.893931 4745 scope.go:117] "RemoveContainer" containerID="12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.894197 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg_service-telemetry(71276875-43be-4d09-a25d-4327369c3a53)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" podUID="71276875-43be-4d09-a25d-4327369c3a53" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.909572 4745 generic.go:334] "Generic (PLEG): container finished" podID="651a6724-09fd-4395-859f-7fdff0781163" containerID="f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.910008 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerDied","Data":"f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.910834 4745 scope.go:117] "RemoveContainer" containerID="f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.911104 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl_service-telemetry(651a6724-09fd-4395-859f-7fdff0781163)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" podUID="651a6724-09fd-4395-859f-7fdff0781163" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.911632 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"9bc30d38-f540-480c-9289-45dbe7a4401b","Type":"ContainerStarted","Data":"baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.913395 4745 generic.go:334] "Generic (PLEG): container finished" podID="aa12986d-ffa2-4a08-9069-77fc4fdd80c6" containerID="386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.913432 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerDied","Data":"386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.913785 4745 scope.go:117] "RemoveContainer" 
containerID="386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.914034 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6_service-telemetry(aa12986d-ffa2-4a08-9069-77fc4fdd80c6)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" podUID="aa12986d-ffa2-4a08-9069-77fc4fdd80c6" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.922747 4745 generic.go:334] "Generic (PLEG): container finished" podID="1d506c7b-246f-4aca-b3ac-635dbc53b579" containerID="155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.922817 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerDied","Data":"155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.923555 4745 scope.go:117] "RemoveContainer" containerID="155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.923837 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7f44857896-xgs4c_service-telemetry(1d506c7b-246f-4aca-b3ac-635dbc53b579)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" podUID="1d506c7b-246f-4aca-b3ac-635dbc53b579" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.965905 4745 scope.go:117] "RemoveContainer" 
containerID="d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7" Mar 19 00:36:43 crc kubenswrapper[4745]: I0319 00:36:43.941206 4745 scope.go:117] "RemoveContainer" containerID="76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b" Mar 19 00:36:44 crc kubenswrapper[4745]: I0319 00:36:44.767234 4745 scope.go:117] "RemoveContainer" containerID="c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67" Mar 19 00:36:44 crc kubenswrapper[4745]: I0319 00:36:44.811150 4745 scope.go:117] "RemoveContainer" containerID="cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a" Mar 19 00:36:52 crc kubenswrapper[4745]: I0319 00:36:52.138200 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:36:52 crc kubenswrapper[4745]: E0319 00:36:52.139280 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:36:54 crc kubenswrapper[4745]: I0319 00:36:54.138071 4745 scope.go:117] "RemoveContainer" containerID="6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17" Mar 19 00:36:55 crc kubenswrapper[4745]: I0319 00:36:55.138015 4745 scope.go:117] "RemoveContainer" containerID="155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527" Mar 19 00:36:56 crc kubenswrapper[4745]: I0319 00:36:56.142576 4745 scope.go:117] "RemoveContainer" containerID="f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376" Mar 19 00:36:56 crc kubenswrapper[4745]: I0319 00:36:56.143364 4745 scope.go:117] "RemoveContainer" 
containerID="12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf" Mar 19 00:36:57 crc kubenswrapper[4745]: I0319 00:36:57.139322 4745 scope.go:117] "RemoveContainer" containerID="386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5" Mar 19 00:36:58 crc kubenswrapper[4745]: E0319 00:36:58.471713 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/tripleowallabycentos9/openstack-qdrouterd:current-tripleo" Mar 19 00:36:58 crc kubenswrapper[4745]: E0319 00:36:58.472335 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:qdr,Image:quay.io/tripleowallabycentos9/openstack-qdrouterd:current-tripleo,Command:[/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:amqp,HostPort:0,ContainerPort:5672,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:default-interconnect-selfsigned-cert,ReadOnly:false,MountPath:/etc/pki/tls/certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:qdr-test-config,ReadOnly:false,MountPath:/etc/qpid-dispatch/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lltwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod qdr-test_service-telemetry(9bc30d38-f540-480c-9289-45dbe7a4401b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:36:58 crc kubenswrapper[4745]: E0319 00:36:58.473876 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"qdr\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/qdr-test" podUID="9bc30d38-f540-480c-9289-45dbe7a4401b" Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.441584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"8585c2a55e2f852cccafc303b1e6bd5e28e4d00f6ce8d9a1261afbc1fe83f17c"} Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.444873 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"50de5a905defabe2101a54ad4ef7569e539363ead6bc9ab9a65fd3c979cad523"} Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.448261 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"24b1d3c893b308ebe909fade6a32dbc0ac0c99873d1cbd61e4d0b4f1dcd4b1c4"} 
Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.451302 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"628e3e976e83bbd945045b092fe5512fde789095faa7e3dddf38c9f347dd4847"} Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.454419 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"34fe18c9b12f936b1b91f114837dc9407930c4bd386f12593e5324d4dbb29032"} Mar 19 00:36:59 crc kubenswrapper[4745]: E0319 00:36:59.458962 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"qdr\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleowallabycentos9/openstack-qdrouterd:current-tripleo\\\"\"" pod="service-telemetry/qdr-test" podUID="9bc30d38-f540-480c-9289-45dbe7a4401b" Mar 19 00:37:06 crc kubenswrapper[4745]: I0319 00:37:06.143020 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:37:06 crc kubenswrapper[4745]: E0319 00:37:06.144162 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.381012 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-9prq4"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.391435 4745 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400012 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400121 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400449 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400777 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.401085 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-9prq4"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.403141 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.403141 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.562857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563276 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563411 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563511 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563604 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563694 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc 
kubenswrapper[4745]: I0319 00:37:11.563786 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.666999 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667256 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " 
pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667466 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667646 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669123 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " 
pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669200 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.670210 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.694052 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"stf-smoketest-smoke1-9prq4\" (UID: 
\"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.723402 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.727445 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.731694 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.735336 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.871172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"curl\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " pod="service-telemetry/curl" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.973308 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"curl\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " pod="service-telemetry/curl" Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.001082 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"curl\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " pod="service-telemetry/curl" Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.127352 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.291004 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-9prq4"] Mar 19 00:37:12 crc kubenswrapper[4745]: W0319 00:37:12.321684 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de37e62_8066_4f02_85a3_4490078b4007.slice/crio-e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184 WatchSource:0}: Error finding container e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184: Status 404 returned error can't find the container with id e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184 Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.533815 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.565665 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerStarted","Data":"e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184"} Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.567781 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"57309d44-0759-4b3b-954f-8253b2f8a0b3","Type":"ContainerStarted","Data":"739c75b2e0182c90e96d768d71238aef55c96dda097b3230911bab9794c28f38"} Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.569731 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"9bc30d38-f540-480c-9289-45dbe7a4401b","Type":"ContainerStarted","Data":"39c63c6437ed94b284a0ebbfdee9d766b5057b6cdd46293b5d9dd99cbb801f7c"} Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.597256 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/qdr-test" podStartSLOduration=3.338983274 podStartE2EDuration="32.597226327s" podCreationTimestamp="2026-03-19 00:36:40 +0000 UTC" firstStartedPulling="2026-03-19 00:36:42.367944598 +0000 UTC m=+1766.906139729" lastFinishedPulling="2026-03-19 00:37:11.626187651 +0000 UTC m=+1796.164382782" observedRunningTime="2026-03-19 00:37:12.592266702 +0000 UTC m=+1797.130461833" watchObservedRunningTime="2026-03-19 00:37:12.597226327 +0000 UTC m=+1797.135421458" Mar 19 00:37:15 crc kubenswrapper[4745]: I0319 00:37:15.618548 4745 generic.go:334] "Generic (PLEG): container finished" podID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerID="12774089bafc3207d1c8eed8db8ca60cf7f5b505e9b2c47330a0acdcf61fbf8d" exitCode=0 Mar 19 00:37:15 crc kubenswrapper[4745]: I0319 00:37:15.618672 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"57309d44-0759-4b3b-954f-8253b2f8a0b3","Type":"ContainerDied","Data":"12774089bafc3207d1c8eed8db8ca60cf7f5b505e9b2c47330a0acdcf61fbf8d"} Mar 19 00:37:21 crc kubenswrapper[4745]: I0319 00:37:21.140850 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:37:21 crc kubenswrapper[4745]: E0319 00:37:21.142174 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:37:22 crc kubenswrapper[4745]: I0319 00:37:22.982073 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.050684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"57309d44-0759-4b3b-954f-8253b2f8a0b3\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.065451 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf" (OuterVolumeSpecName: "kube-api-access-57qgf") pod "57309d44-0759-4b3b-954f-8253b2f8a0b3" (UID: "57309d44-0759-4b3b-954f-8253b2f8a0b3"). InnerVolumeSpecName "kube-api-access-57qgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.153695 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") on node \"crc\" DevicePath \"\"" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.181098 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_57309d44-0759-4b3b-954f-8253b2f8a0b3/curl/0.log" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.453014 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4pxj4_7ed8bcc5-1389-4cff-b64c-bf3b813f642e/prometheus-webhook-snmp/0.log" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.690045 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"57309d44-0759-4b3b-954f-8253b2f8a0b3","Type":"ContainerDied","Data":"739c75b2e0182c90e96d768d71238aef55c96dda097b3230911bab9794c28f38"} Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.690109 4745 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="739c75b2e0182c90e96d768d71238aef55c96dda097b3230911bab9794c28f38" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.690146 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:25 crc kubenswrapper[4745]: I0319 00:37:25.714903 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerStarted","Data":"48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7"} Mar 19 00:37:33 crc kubenswrapper[4745]: I0319 00:37:33.138142 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:37:33 crc kubenswrapper[4745]: E0319 00:37:33.139153 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:37:33 crc kubenswrapper[4745]: I0319 00:37:33.796767 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerStarted","Data":"2bde59b3410df5f3ca3524d81967b5360d24635e0a7e3f093cd6e92af7b10a42"} Mar 19 00:37:33 crc kubenswrapper[4745]: I0319 00:37:33.821780 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-9prq4" podStartSLOduration=2.40646354 podStartE2EDuration="22.821754961s" podCreationTimestamp="2026-03-19 00:37:11 +0000 UTC" firstStartedPulling="2026-03-19 00:37:12.324132166 
+0000 UTC m=+1796.862327307" lastFinishedPulling="2026-03-19 00:37:32.739423597 +0000 UTC m=+1817.277618728" observedRunningTime="2026-03-19 00:37:33.814037869 +0000 UTC m=+1818.352233020" watchObservedRunningTime="2026-03-19 00:37:33.821754961 +0000 UTC m=+1818.359950092" Mar 19 00:37:46 crc kubenswrapper[4745]: I0319 00:37:46.143706 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:37:46 crc kubenswrapper[4745]: E0319 00:37:46.144909 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:37:53 crc kubenswrapper[4745]: I0319 00:37:53.624078 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4pxj4_7ed8bcc5-1389-4cff-b64c-bf3b813f642e/prometheus-webhook-snmp/0.log" Mar 19 00:37:58 crc kubenswrapper[4745]: I0319 00:37:58.996869 4745 generic.go:334] "Generic (PLEG): container finished" podID="7de37e62-8066-4f02-85a3-4490078b4007" containerID="48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7" exitCode=0 Mar 19 00:37:58 crc kubenswrapper[4745]: I0319 00:37:58.997103 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerDied","Data":"48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7"} Mar 19 00:37:59 crc kubenswrapper[4745]: I0319 00:37:58.998197 4745 scope.go:117] "RemoveContainer" containerID="48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 
00:38:00.147926 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"] Mar 19 00:38:00 crc kubenswrapper[4745]: E0319 00:38:00.148265 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerName="curl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.148284 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerName="curl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.148469 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerName="curl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.149148 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.151208 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"] Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.152350 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.152638 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.157607 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"auto-csr-approver-29564678-m5hsl\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.161411 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.260278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"auto-csr-approver-29564678-m5hsl\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.285691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"auto-csr-approver-29564678-m5hsl\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.487253 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.720779 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"] Mar 19 00:38:01 crc kubenswrapper[4745]: I0319 00:38:01.014135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerStarted","Data":"4fe4236b2795b4676a30cbdd0f69a1604ccf5cd49f0c60a5899e82082e2165de"} Mar 19 00:38:01 crc kubenswrapper[4745]: I0319 00:38:01.138679 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:38:01 crc kubenswrapper[4745]: E0319 00:38:01.138955 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:38:02 crc kubenswrapper[4745]: I0319 00:38:02.026666 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerStarted","Data":"97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413"} Mar 19 00:38:02 crc kubenswrapper[4745]: I0319 00:38:02.044621 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" podStartSLOduration=1.076213796 podStartE2EDuration="2.04459914s" podCreationTimestamp="2026-03-19 00:38:00 +0000 UTC" firstStartedPulling="2026-03-19 00:38:00.726752542 +0000 UTC m=+1845.264947673" lastFinishedPulling="2026-03-19 00:38:01.695137886 +0000 UTC m=+1846.233333017" observedRunningTime="2026-03-19 00:38:02.041815033 +0000 UTC m=+1846.580010184" watchObservedRunningTime="2026-03-19 00:38:02.04459914 +0000 UTC m=+1846.582794271" Mar 19 00:38:03 crc kubenswrapper[4745]: I0319 00:38:03.036519 4745 generic.go:334] "Generic (PLEG): container finished" podID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerID="97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413" exitCode=0 Mar 19 00:38:03 crc kubenswrapper[4745]: I0319 00:38:03.036584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerDied","Data":"97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413"} Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.292233 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.332563 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.338797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6" (OuterVolumeSpecName: "kube-api-access-v48z6") pod "c3ff29f0-5d28-4572-bd21-aac2f86091a8" (UID: "c3ff29f0-5d28-4572-bd21-aac2f86091a8"). InnerVolumeSpecName "kube-api-access-v48z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.434627 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.055147 4745 generic.go:334] "Generic (PLEG): container finished" podID="7de37e62-8066-4f02-85a3-4490078b4007" containerID="2bde59b3410df5f3ca3524d81967b5360d24635e0a7e3f093cd6e92af7b10a42" exitCode=0 Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.055253 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerDied","Data":"2bde59b3410df5f3ca3524d81967b5360d24635e0a7e3f093cd6e92af7b10a42"} Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.057599 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" 
event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerDied","Data":"4fe4236b2795b4676a30cbdd0f69a1604ccf5cd49f0c60a5899e82082e2165de"} Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.057634 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fe4236b2795b4676a30cbdd0f69a1604ccf5cd49f0c60a5899e82082e2165de" Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.057707 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.115821 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"] Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.124245 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"] Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.152659 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" path="/var/lib/kubelet/pods/550c50ae-5519-4c0d-b2b0-7415d134808f/volumes" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.331058 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385291 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385480 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385564 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385752 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385814 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385850 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.410804 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.412029 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.412734 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.414379 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj" (OuterVolumeSpecName: "kube-api-access-492lj") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "kube-api-access-492lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.415702 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.416013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.425853 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488411 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488469 4745 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488480 4745 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488489 4745 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488499 4745 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488510 4745 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488519 4745 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:38:07 crc kubenswrapper[4745]: I0319 00:38:07.078297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerDied","Data":"e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184"} Mar 19 00:38:07 crc kubenswrapper[4745]: I0319 00:38:07.078763 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184" Mar 19 00:38:07 crc kubenswrapper[4745]: I0319 00:38:07.078435 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:38:08 crc kubenswrapper[4745]: I0319 00:38:08.280969 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-9prq4_7de37e62-8066-4f02-85a3-4490078b4007/smoketest-collectd/0.log" Mar 19 00:38:08 crc kubenswrapper[4745]: I0319 00:38:08.530434 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-9prq4_7de37e62-8066-4f02-85a3-4490078b4007/smoketest-ceilometer/0.log" Mar 19 00:38:08 crc kubenswrapper[4745]: I0319 00:38:08.766781 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-wgvm9_b58f3451-d46e-43bc-8d65-cb9abbc9de0d/default-interconnect/0.log" Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.001828 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg_71276875-43be-4d09-a25d-4327369c3a53/bridge/2.log" Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.290377 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg_71276875-43be-4d09-a25d-4327369c3a53/sg-core/0.log" Mar 19 00:38:09 
crc kubenswrapper[4745]: I0319 00:38:09.558222 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7f44857896-xgs4c_1d506c7b-246f-4aca-b3ac-635dbc53b579/bridge/2.log" Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.775096 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7f44857896-xgs4c_1d506c7b-246f-4aca-b3ac-635dbc53b579/sg-core/0.log" Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.062700 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6_aa12986d-ffa2-4a08-9069-77fc4fdd80c6/bridge/2.log" Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.294645 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6_aa12986d-ffa2-4a08-9069-77fc4fdd80c6/sg-core/0.log" Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.527339 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw_f88023b4-4c23-4946-a12b-3f0cdab93771/bridge/2.log" Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.796572 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw_f88023b4-4c23-4946-a12b-3f0cdab93771/sg-core/0.log" Mar 19 00:38:11 crc kubenswrapper[4745]: I0319 00:38:11.104165 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl_651a6724-09fd-4395-859f-7fdff0781163/bridge/2.log" Mar 19 00:38:11 crc kubenswrapper[4745]: I0319 00:38:11.376667 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl_651a6724-09fd-4395-859f-7fdff0781163/sg-core/0.log" Mar 19 00:38:15 crc kubenswrapper[4745]: I0319 00:38:15.137601 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:38:15 crc kubenswrapper[4745]: E0319 00:38:15.138277 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:38:15 crc kubenswrapper[4745]: I0319 00:38:15.484723 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-ff5d8cc8d-tcxb4_a8895e98-6a3f-4f8a-b671-9a7920ceb390/operator/0.log" Mar 19 00:38:15 crc kubenswrapper[4745]: I0319 00:38:15.767055 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_7bba4496-9224-467b-80ca-ff25c39604ec/prometheus/0.log" Mar 19 00:38:16 crc kubenswrapper[4745]: I0319 00:38:16.029665 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_884040c3-6c56-45b0-881d-e73f52c0ab34/elasticsearch/0.log" Mar 19 00:38:16 crc kubenswrapper[4745]: I0319 00:38:16.291437 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4pxj4_7ed8bcc5-1389-4cff-b64c-bf3b813f642e/prometheus-webhook-snmp/0.log" Mar 19 00:38:16 crc kubenswrapper[4745]: I0319 00:38:16.582138 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7/alertmanager/0.log" Mar 19 00:38:29 
crc kubenswrapper[4745]: I0319 00:38:29.138499 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:38:29 crc kubenswrapper[4745]: E0319 00:38:29.139598 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:38:33 crc kubenswrapper[4745]: I0319 00:38:33.973402 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-c87c48cb6-d4c8j_4754eb2f-bab5-413c-ab43-3b9142082c2f/operator/0.log" Mar 19 00:38:35 crc kubenswrapper[4745]: I0319 00:38:35.878577 4745 scope.go:117] "RemoveContainer" containerID="ad4d25cdb1eb7abf3f1713a4b642271a43e4a8fa68c0fb36024884e82f682adb" Mar 19 00:38:35 crc kubenswrapper[4745]: I0319 00:38:35.934112 4745 scope.go:117] "RemoveContainer" containerID="f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a" Mar 19 00:38:35 crc kubenswrapper[4745]: I0319 00:38:35.982684 4745 scope.go:117] "RemoveContainer" containerID="974a4b2b0120bf7547c402b35bf7ecab55db0da6f49394541dc6bc7af4cdda92" Mar 19 00:38:37 crc kubenswrapper[4745]: I0319 00:38:37.769932 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-ff5d8cc8d-tcxb4_a8895e98-6a3f-4f8a-b671-9a7920ceb390/operator/0.log" Mar 19 00:38:38 crc kubenswrapper[4745]: I0319 00:38:38.025855 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_9bc30d38-f540-480c-9289-45dbe7a4401b/qdr/0.log" Mar 19 00:38:43 crc kubenswrapper[4745]: I0319 00:38:43.138469 4745 scope.go:117] "RemoveContainer" 
containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:38:43 crc kubenswrapper[4745]: E0319 00:38:43.139649 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:38:56 crc kubenswrapper[4745]: I0319 00:38:56.143754 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:38:56 crc kubenswrapper[4745]: I0319 00:38:56.937361 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"} Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.931114 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:39:17 crc kubenswrapper[4745]: E0319 00:39:17.932159 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerName="oc" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932176 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerName="oc" Mar 19 00:39:17 crc kubenswrapper[4745]: E0319 00:39:17.932197 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-ceilometer" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932204 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de37e62-8066-4f02-85a3-4490078b4007" 
containerName="smoketest-ceilometer" Mar 19 00:39:17 crc kubenswrapper[4745]: E0319 00:39:17.932218 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-collectd" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932228 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-collectd" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932385 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-ceilometer" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932399 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerName="oc" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932408 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-collectd" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.933175 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.937361 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-45wdk"/"kube-root-ca.crt" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.937418 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-45wdk"/"openshift-service-ca.crt" Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.946393 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.073751 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.073865 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.175675 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.175754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.176479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.202809 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.255605 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.859520 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.868598 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:39:19 crc kubenswrapper[4745]: I0319 00:39:19.129364 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerStarted","Data":"95b710504e843b1adaa747cda1ceed0cfff7d23d3901dd786e9517db4b1e363b"} Mar 19 00:39:27 crc kubenswrapper[4745]: I0319 00:39:27.214844 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerStarted","Data":"9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2"} Mar 19 00:39:27 crc kubenswrapper[4745]: I0319 00:39:27.216012 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerStarted","Data":"e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a"} Mar 19 00:39:27 crc kubenswrapper[4745]: I0319 00:39:27.237038 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-45wdk/must-gather-j98n7" podStartSLOduration=3.074163595 podStartE2EDuration="10.23701552s" podCreationTimestamp="2026-03-19 00:39:17 +0000 UTC" firstStartedPulling="2026-03-19 00:39:18.868534723 +0000 UTC m=+1923.406729854" lastFinishedPulling="2026-03-19 00:39:26.031386648 +0000 UTC m=+1930.569581779" observedRunningTime="2026-03-19 00:39:27.230767604 +0000 UTC m=+1931.768962755" watchObservedRunningTime="2026-03-19 00:39:27.23701552 +0000 UTC 
m=+1931.775210651" Mar 19 00:39:36 crc kubenswrapper[4745]: I0319 00:39:36.055561 4745 scope.go:117] "RemoveContainer" containerID="3d3abcaceec0d44feeaa99fe5fc507d2939843b4e5f2688e33f19d72f84aabe1" Mar 19 00:39:36 crc kubenswrapper[4745]: I0319 00:39:36.098962 4745 scope.go:117] "RemoveContainer" containerID="41cdf9f33044f6a4909a9e2e26ad76fb6b92253759abe1d7140516760d28b75c" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.227852 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.229579 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.239738 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.363718 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") pod \"infrawatch-operators-629gf\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.464993 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") pod \"infrawatch-operators-629gf\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.489174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") 
pod \"infrawatch-operators-629gf\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.563551 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.908052 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:44 crc kubenswrapper[4745]: I0319 00:39:44.379123 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerStarted","Data":"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45"} Mar 19 00:39:44 crc kubenswrapper[4745]: I0319 00:39:44.379621 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerStarted","Data":"d38bfb040a14025f30b9650d668f1ed36e039a8469a10993a3a97e36ef4d5129"} Mar 19 00:39:44 crc kubenswrapper[4745]: I0319 00:39:44.397900 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-629gf" podStartSLOduration=1.2599673120000001 podStartE2EDuration="1.397864081s" podCreationTimestamp="2026-03-19 00:39:43 +0000 UTC" firstStartedPulling="2026-03-19 00:39:43.907610686 +0000 UTC m=+1948.445805817" lastFinishedPulling="2026-03-19 00:39:44.045507455 +0000 UTC m=+1948.583702586" observedRunningTime="2026-03-19 00:39:44.394343631 +0000 UTC m=+1948.932538762" watchObservedRunningTime="2026-03-19 00:39:44.397864081 +0000 UTC m=+1948.936059212" Mar 19 00:39:53 crc kubenswrapper[4745]: I0319 00:39:53.564394 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:53 crc 
kubenswrapper[4745]: I0319 00:39:53.564905 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:53 crc kubenswrapper[4745]: I0319 00:39:53.600959 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:54 crc kubenswrapper[4745]: I0319 00:39:54.482568 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:54 crc kubenswrapper[4745]: I0319 00:39:54.531588 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.473111 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-629gf" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" containerID="cri-o://c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" gracePeriod=2 Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.861063 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.980181 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") pod \"f5af6c85-4fb0-4045-b2cb-f96258977b97\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.987675 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg" (OuterVolumeSpecName: "kube-api-access-vr7tg") pod "f5af6c85-4fb0-4045-b2cb-f96258977b97" (UID: "f5af6c85-4fb0-4045-b2cb-f96258977b97"). InnerVolumeSpecName "kube-api-access-vr7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.081821 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") on node \"crc\" DevicePath \"\"" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482289 4745 generic.go:334] "Generic (PLEG): container finished" podID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" exitCode=0 Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482349 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerDied","Data":"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45"} Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482389 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" 
event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerDied","Data":"d38bfb040a14025f30b9650d668f1ed36e039a8469a10993a3a97e36ef4d5129"} Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482417 4745 scope.go:117] "RemoveContainer" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482626 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.513785 4745 scope.go:117] "RemoveContainer" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" Mar 19 00:39:57 crc kubenswrapper[4745]: E0319 00:39:57.514437 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45\": container with ID starting with c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45 not found: ID does not exist" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.514505 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45"} err="failed to get container status \"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45\": rpc error: code = NotFound desc = could not find container \"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45\": container with ID starting with c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45 not found: ID does not exist" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.517806 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.527467 4745 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:58 crc kubenswrapper[4745]: I0319 00:39:58.147189 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" path="/var/lib/kubelet/pods/f5af6c85-4fb0-4045-b2cb-f96258977b97/volumes" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.147731 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:40:00 crc kubenswrapper[4745]: E0319 00:40:00.148529 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.148552 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.148706 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.149296 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.155898 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.156114 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.156187 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.156389 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.235625 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"auto-csr-approver-29564680-w752m\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.338012 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"auto-csr-approver-29564680-w752m\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.371540 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"auto-csr-approver-29564680-w752m\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " 
pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.468013 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.691216 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:40:01 crc kubenswrapper[4745]: I0319 00:40:01.516647 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564680-w752m" event={"ID":"2afbbf4f-151b-4d25-9658-58353102abde","Type":"ContainerStarted","Data":"86721070d9a01d8c5d6051733892503901e00f811a580bb234089aa495d0de9f"} Mar 19 00:40:02 crc kubenswrapper[4745]: I0319 00:40:02.528831 4745 generic.go:334] "Generic (PLEG): container finished" podID="2afbbf4f-151b-4d25-9658-58353102abde" containerID="b11ec2f181d1b8a78640375e23f561adc4550277264f46fc103ead95a4d312d9" exitCode=0 Mar 19 00:40:02 crc kubenswrapper[4745]: I0319 00:40:02.528910 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564680-w752m" event={"ID":"2afbbf4f-151b-4d25-9658-58353102abde","Type":"ContainerDied","Data":"b11ec2f181d1b8a78640375e23f561adc4550277264f46fc103ead95a4d312d9"} Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.790508 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.897238 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"2afbbf4f-151b-4d25-9658-58353102abde\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.905458 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7" (OuterVolumeSpecName: "kube-api-access-765v7") pod "2afbbf4f-151b-4d25-9658-58353102abde" (UID: "2afbbf4f-151b-4d25-9658-58353102abde"). InnerVolumeSpecName "kube-api-access-765v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.999628 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") on node \"crc\" DevicePath \"\"" Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.549473 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564680-w752m" event={"ID":"2afbbf4f-151b-4d25-9658-58353102abde","Type":"ContainerDied","Data":"86721070d9a01d8c5d6051733892503901e00f811a580bb234089aa495d0de9f"} Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.549920 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86721070d9a01d8c5d6051733892503901e00f811a580bb234089aa495d0de9f" Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.549512 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.866581 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.873219 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:40:06 crc kubenswrapper[4745]: I0319 00:40:06.147599 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" path="/var/lib/kubelet/pods/c9fad81f-d73f-4e01-9a07-66b20741533e/volumes" Mar 19 00:40:08 crc kubenswrapper[4745]: I0319 00:40:08.612153 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fztjk_cb0a157b-0f6d-4738-ae67-e29407c2ba8e/control-plane-machine-set-operator/0.log" Mar 19 00:40:08 crc kubenswrapper[4745]: I0319 00:40:08.774288 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hg72d_660e3fac-6534-49e0-a81e-38971c9fec3f/kube-rbac-proxy/0.log" Mar 19 00:40:08 crc kubenswrapper[4745]: I0319 00:40:08.818725 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hg72d_660e3fac-6534-49e0-a81e-38971c9fec3f/machine-api-operator/0.log" Mar 19 00:40:20 crc kubenswrapper[4745]: I0319 00:40:20.680340 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-ptrd5_93f48ad8-0863-4d90-abac-b887096b386c/cert-manager-controller/0.log" Mar 19 00:40:20 crc kubenswrapper[4745]: I0319 00:40:20.807351 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-cpzpl_bbe8b718-863a-404e-9be9-e872318f1ac0/cert-manager-cainjector/0.log" Mar 19 00:40:20 crc kubenswrapper[4745]: I0319 
00:40:20.860849 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-vr5md_9f7ceaac-a9f7-467b-83c9-298813ff6323/cert-manager-webhook/0.log" Mar 19 00:40:34 crc kubenswrapper[4745]: I0319 00:40:34.761458 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-tz2rm_bcf530c8-afe8-4a0e-9e5c-bfd85712e37a/prometheus-operator/0.log" Mar 19 00:40:34 crc kubenswrapper[4745]: I0319 00:40:34.881005 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2_f5c4fe84-51cb-479a-a8cc-2e07bde21417/prometheus-operator-admission-webhook/0.log" Mar 19 00:40:34 crc kubenswrapper[4745]: I0319 00:40:34.957158 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78_69d14850-5c50-4c06-8581-2a70644c7de7/prometheus-operator-admission-webhook/0.log" Mar 19 00:40:35 crc kubenswrapper[4745]: I0319 00:40:35.085897 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-shlz7_a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534/operator/0.log" Mar 19 00:40:35 crc kubenswrapper[4745]: I0319 00:40:35.184583 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-78548ff687-rjvkn_58d05d23-3632-4b84-94f8-1db548b90a03/perses-operator/0.log" Mar 19 00:40:36 crc kubenswrapper[4745]: I0319 00:40:36.170032 4745 scope.go:117] "RemoveContainer" containerID="6316938ec69e0de5c06031d730ac5d55047c4061f5dea80ee35cf89954f73a68" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.325214 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/util/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 
00:40:49.513848 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/pull/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.529013 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/util/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.546574 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/pull/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.734131 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/util/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.735014 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/pull/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.768750 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/extract/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.932354 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.102081 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.132364 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.137093 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.393277 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.403110 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.429895 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/extract/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.555943 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.722694 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/util/0.log" Mar 19 
00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.745566 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.782008 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.928293 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.956160 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/extract/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.959642 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.090648 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/util/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.269113 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.286246 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.289741 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/util/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.471194 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/extract/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.480400 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.497740 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/util/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.657598 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-utilities/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.826955 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-utilities/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.846595 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-content/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.894158 4745 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.067163 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-utilities/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.075978 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.438191 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/registry-server/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.473664 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-utilities/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.654574 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.688847 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-utilities/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.714029 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.835736 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-utilities/0.log"
Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.872244 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-content/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.084167 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hcgn8_9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06/marketplace-operator/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.249321 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-utilities/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.337129 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/registry-server/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.465972 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-content/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.505345 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-content/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.517823 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-utilities/0.log"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.300733 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-utilities/0.log"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.319970 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-content/0.log"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.714904 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:40:54 crc kubenswrapper[4745]: E0319 00:40:54.715358 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afbbf4f-151b-4d25-9658-58353102abde" containerName="oc"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.715380 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afbbf4f-151b-4d25-9658-58353102abde" containerName="oc"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.715575 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afbbf4f-151b-4d25-9658-58353102abde" containerName="oc"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.721225 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.738607 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.854337 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.854462 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.854500 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.956370 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.956451 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.956483 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.957103 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.957934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.986491 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:55 crc kubenswrapper[4745]: I0319 00:40:55.375940 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:55 crc kubenswrapper[4745]: I0319 00:40:55.401975 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/registry-server/0.log"
Mar 19 00:40:55 crc kubenswrapper[4745]: I0319 00:40:55.978373 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.485434 4745 generic.go:334] "Generic (PLEG): container finished" podID="2941df91-78ca-4017-94ec-60f34ac379a1" containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8" exitCode=0
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.485510 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"}
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.485829 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerStarted","Data":"66be26ed0cc5fc431474d754705969bca8f867ddcc5eb902f18bf1d6f9dc4737"}
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.553369 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.555106 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.571990 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.697838 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.697917 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.697953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.799416 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.799979 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.800038 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.800740 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.801222 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.823369 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.877999 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:58 crc kubenswrapper[4745]: I0319 00:40:58.592140 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:40:58 crc kubenswrapper[4745]: W0319 00:40:58.602440 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8674bfe_de0e_4acb_a1c3_e8a3bdca029c.slice/crio-180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506 WatchSource:0}: Error finding container 180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506: Status 404 returned error can't find the container with id 180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506
Mar 19 00:40:58 crc kubenswrapper[4745]: I0319 00:40:58.907777 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerStarted","Data":"180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506"}
Mar 19 00:41:00 crc kubenswrapper[4745]: I0319 00:41:00.209590 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139" exitCode=0
Mar 19 00:41:00 crc kubenswrapper[4745]: I0319 00:41:00.209788 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"}
Mar 19 00:41:01 crc kubenswrapper[4745]: I0319 00:41:01.220868 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerStarted","Data":"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"}
Mar 19 00:41:01 crc kubenswrapper[4745]: I0319 00:41:01.223649 4745 generic.go:334] "Generic (PLEG): container finished" podID="2941df91-78ca-4017-94ec-60f34ac379a1" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126" exitCode=0
Mar 19 00:41:01 crc kubenswrapper[4745]: I0319 00:41:01.223723 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"}
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.234264 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1" exitCode=0
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.234367 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"}
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.238092 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerStarted","Data":"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"}
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.291418 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k2n7t" podStartSLOduration=2.944246911 podStartE2EDuration="8.291390834s" podCreationTimestamp="2026-03-19 00:40:54 +0000 UTC" firstStartedPulling="2026-03-19 00:40:56.488517523 +0000 UTC m=+2021.026712654" lastFinishedPulling="2026-03-19 00:41:01.835661446 +0000 UTC m=+2026.373856577" observedRunningTime="2026-03-19 00:41:02.284482404 +0000 UTC m=+2026.822677555" watchObservedRunningTime="2026-03-19 00:41:02.291390834 +0000 UTC m=+2026.829585965"
Mar 19 00:41:03 crc kubenswrapper[4745]: I0319 00:41:03.250398 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerStarted","Data":"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"}
Mar 19 00:41:03 crc kubenswrapper[4745]: I0319 00:41:03.276487 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7tnv7" podStartSLOduration=4.842246496 podStartE2EDuration="7.276463797s" podCreationTimestamp="2026-03-19 00:40:56 +0000 UTC" firstStartedPulling="2026-03-19 00:41:00.213564675 +0000 UTC m=+2024.751759806" lastFinishedPulling="2026-03-19 00:41:02.647781976 +0000 UTC m=+2027.185977107" observedRunningTime="2026-03-19 00:41:03.275359262 +0000 UTC m=+2027.813554403" watchObservedRunningTime="2026-03-19 00:41:03.276463797 +0000 UTC m=+2027.814658928"
Mar 19 00:41:05 crc kubenswrapper[4745]: I0319 00:41:05.377408 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:05 crc kubenswrapper[4745]: I0319 00:41:05.377921 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:05 crc kubenswrapper[4745]: I0319 00:41:05.433802 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:06 crc kubenswrapper[4745]: I0319 00:41:06.879299 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:06 crc kubenswrapper[4745]: I0319 00:41:06.880651 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:06 crc kubenswrapper[4745]: I0319 00:41:06.921927 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:07 crc kubenswrapper[4745]: I0319 00:41:07.327826 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:07 crc kubenswrapper[4745]: I0319 00:41:07.328398 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:07 crc kubenswrapper[4745]: I0319 00:41:07.946526 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:41:09 crc kubenswrapper[4745]: I0319 00:41:09.294678 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7tnv7" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" containerID="cri-o://b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf" gracePeriod=2
Mar 19 00:41:09 crc kubenswrapper[4745]: I0319 00:41:09.745034 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:41:09 crc kubenswrapper[4745]: I0319 00:41:09.745310 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k2n7t" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" containerID="cri-o://768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478" gracePeriod=2
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.273869 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.279164 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.307317 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf" exitCode=0
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308246 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308678 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308721 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308740 4745 scope.go:117] "RemoveContainer" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320509 4745 generic.go:334] "Generic (PLEG): container finished" podID="2941df91-78ca-4017-94ec-60f34ac379a1" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478" exitCode=0
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320586 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320643 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"66be26ed0cc5fc431474d754705969bca8f867ddcc5eb902f18bf1d6f9dc4737"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320753 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.333686 4745 scope.go:117] "RemoveContainer" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.341768 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"2941df91-78ca-4017-94ec-60f34ac379a1\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.341909 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"2941df91-78ca-4017-94ec-60f34ac379a1\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.341975 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"2941df91-78ca-4017-94ec-60f34ac379a1\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.342034 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.342085 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.342125 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.345455 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities" (OuterVolumeSpecName: "utilities") pod "2941df91-78ca-4017-94ec-60f34ac379a1" (UID: "2941df91-78ca-4017-94ec-60f34ac379a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.350084 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities" (OuterVolumeSpecName: "utilities") pod "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" (UID: "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.372177 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4" (OuterVolumeSpecName: "kube-api-access-g7sf4") pod "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" (UID: "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c"). InnerVolumeSpecName "kube-api-access-g7sf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.372266 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x" (OuterVolumeSpecName: "kube-api-access-qpb4x") pod "2941df91-78ca-4017-94ec-60f34ac379a1" (UID: "2941df91-78ca-4017-94ec-60f34ac379a1"). InnerVolumeSpecName "kube-api-access-qpb4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.407245 4745 scope.go:117] "RemoveContainer" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444817 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444855 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444865 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444873 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.477383 4745 scope.go:117] "RemoveContainer" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.478143 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf\": container with ID starting with b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf not found: ID does not exist" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478175 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"} err="failed to get container status \"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf\": rpc error: code = NotFound desc = could not find container \"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf\": container with ID starting with b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478210 4745 scope.go:117] "RemoveContainer" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.478730 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1\": container with ID starting with 00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1 not found: ID does not exist" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478753 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"} err="failed to get container status \"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1\": rpc error: code = NotFound desc = could not find container \"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1\": container with ID starting with 00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478769 4745 scope.go:117] "RemoveContainer" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.479380 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139\": container with ID starting with 7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139 not found: ID does not exist" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.479445 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"} err="failed to get container status \"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139\": rpc error: code = NotFound desc = could not find container \"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139\": container with ID starting with 7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.479478 4745 scope.go:117] "RemoveContainer" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.495133 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" (UID: "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.504603 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2941df91-78ca-4017-94ec-60f34ac379a1" (UID: "2941df91-78ca-4017-94ec-60f34ac379a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.506677 4745 scope.go:117] "RemoveContainer" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.529186 4745 scope.go:117] "RemoveContainer" containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.546843 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.546900 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.555938 4745 scope.go:117] "RemoveContainer" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.556635 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478\": container with ID starting with 768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478 not found: ID does not exist" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.556686 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"} err="failed to get container status \"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478\": rpc error: code = NotFound desc = could not find container \"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478\": container with ID starting with 768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.556817 4745 scope.go:117] "RemoveContainer" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.557238 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126\": container with ID starting with 7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126 not found: ID does not exist" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.557291 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"} err="failed to get container status \"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126\": rpc error: code = NotFound desc = could not find container \"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126\": container with ID starting with 7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.557335 4745 scope.go:117] "RemoveContainer" containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.557751 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8\": container with ID starting with 07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8 not found: ID does not exist"
containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8" Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.557823 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"} err="failed to get container status \"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8\": rpc error: code = NotFound desc = could not find container \"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8\": container with ID starting with 07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8 not found: ID does not exist" Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.642782 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"] Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.653978 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"] Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.662562 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"] Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.671415 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"] Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.243092 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-tz2rm_bcf530c8-afe8-4a0e-9e5c-bfd85712e37a/prometheus-operator/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.295962 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2_f5c4fe84-51cb-479a-a8cc-2e07bde21417/prometheus-operator-admission-webhook/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.348813 4745 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78_69d14850-5c50-4c06-8581-2a70644c7de7/prometheus-operator-admission-webhook/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.431598 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-shlz7_a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534/operator/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.530008 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-78548ff687-rjvkn_58d05d23-3632-4b84-94f8-1db548b90a03/perses-operator/0.log" Mar 19 00:41:12 crc kubenswrapper[4745]: I0319 00:41:12.151841 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" path="/var/lib/kubelet/pods/2941df91-78ca-4017-94ec-60f34ac379a1/volumes" Mar 19 00:41:12 crc kubenswrapper[4745]: I0319 00:41:12.152695 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" path="/var/lib/kubelet/pods/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c/volumes" Mar 19 00:41:15 crc kubenswrapper[4745]: I0319 00:41:15.606243 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:41:15 crc kubenswrapper[4745]: I0319 00:41:15.608039 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:41:45 crc kubenswrapper[4745]: I0319 00:41:45.606364 4745 
patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:41:45 crc kubenswrapper[4745]: I0319 00:41:45.606966 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.153149 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564682-qhdlb"] Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154197 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154210 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154227 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154234 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154247 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154255 4745 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154271 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154276 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154288 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154294 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154310 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154317 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154452 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154465 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.155179 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.158632 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.158725 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.159610 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.165016 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564682-qhdlb"] Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.326901 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"auto-csr-approver-29564682-qhdlb\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.430351 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"auto-csr-approver-29564682-qhdlb\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.457390 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"auto-csr-approver-29564682-qhdlb\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " 
pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.483248 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.711206 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564682-qhdlb"] Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.768236 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" event={"ID":"cb19dc97-c11c-4239-9119-66b62533468d","Type":"ContainerStarted","Data":"9eb418b15c5055fa4b6c8739ea732b403f78c464de0844e12bbc3d3cdaaaaafb"} Mar 19 00:42:02 crc kubenswrapper[4745]: I0319 00:42:02.799085 4745 generic.go:334] "Generic (PLEG): container finished" podID="cb19dc97-c11c-4239-9119-66b62533468d" containerID="989ddcb40e12489a6193192610525ba17add6f9232e7834040bccd51c493446e" exitCode=0 Mar 19 00:42:02 crc kubenswrapper[4745]: I0319 00:42:02.799510 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" event={"ID":"cb19dc97-c11c-4239-9119-66b62533468d","Type":"ContainerDied","Data":"989ddcb40e12489a6193192610525ba17add6f9232e7834040bccd51c493446e"} Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.065614 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.210920 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"cb19dc97-c11c-4239-9119-66b62533468d\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.231989 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r" (OuterVolumeSpecName: "kube-api-access-q5w9r") pod "cb19dc97-c11c-4239-9119-66b62533468d" (UID: "cb19dc97-c11c-4239-9119-66b62533468d"). InnerVolumeSpecName "kube-api-access-q5w9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.312869 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.827093 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" event={"ID":"cb19dc97-c11c-4239-9119-66b62533468d","Type":"ContainerDied","Data":"9eb418b15c5055fa4b6c8739ea732b403f78c464de0844e12bbc3d3cdaaaaafb"} Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.827450 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb418b15c5055fa4b6c8739ea732b403f78c464de0844e12bbc3d3cdaaaaafb" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.827360 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:05 crc kubenswrapper[4745]: I0319 00:42:05.144497 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:42:05 crc kubenswrapper[4745]: I0319 00:42:05.150930 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:42:06 crc kubenswrapper[4745]: I0319 00:42:06.148133 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" path="/var/lib/kubelet/pods/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60/volumes" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.606440 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607065 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607122 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607674 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607732 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01" gracePeriod=600 Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.915317 4745 generic.go:334] "Generic (PLEG): container finished" podID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerID="e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a" exitCode=0 Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.915414 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerDied","Data":"e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a"} Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.916307 4745 scope.go:117] "RemoveContainer" containerID="e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.922176 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01" exitCode=0 Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.922229 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"} Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.922279 4745 scope.go:117] "RemoveContainer" 
containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:42:16 crc kubenswrapper[4745]: I0319 00:42:16.318340 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/gather/0.log" Mar 19 00:42:16 crc kubenswrapper[4745]: I0319 00:42:16.932559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa"} Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.639094 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.640436 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-45wdk/must-gather-j98n7" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" containerID="cri-o://9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2" gracePeriod=2 Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.646630 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.999105 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/copy/0.log" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.000086 4745 generic.go:334] "Generic (PLEG): container finished" podID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerID="9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2" exitCode=143 Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.092344 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/copy/0.log" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.093120 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.134093 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.134220 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.143290 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b" (OuterVolumeSpecName: "kube-api-access-ncd7b") pod "08615e98-17b3-40c9-8b9b-e372a9ca1b04" (UID: "08615e98-17b3-40c9-8b9b-e372a9ca1b04"). InnerVolumeSpecName "kube-api-access-ncd7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.209897 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "08615e98-17b3-40c9-8b9b-e372a9ca1b04" (UID: "08615e98-17b3-40c9-8b9b-e372a9ca1b04"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.236139 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.236212 4745 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.011897 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/copy/0.log" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.013086 4745 scope.go:117] "RemoveContainer" containerID="9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.013216 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.046480 4745 scope.go:117] "RemoveContainer" containerID="e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a" Mar 19 00:42:26 crc kubenswrapper[4745]: I0319 00:42:26.147520 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" path="/var/lib/kubelet/pods/08615e98-17b3-40c9-8b9b-e372a9ca1b04/volumes" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.528022 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:35 crc kubenswrapper[4745]: E0319 00:42:35.529289 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529308 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" Mar 19 00:42:35 crc kubenswrapper[4745]: E0319 00:42:35.529337 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="gather" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529346 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="gather" Mar 19 00:42:35 crc kubenswrapper[4745]: E0319 00:42:35.529370 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb19dc97-c11c-4239-9119-66b62533468d" containerName="oc" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529379 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb19dc97-c11c-4239-9119-66b62533468d" containerName="oc" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529519 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" Mar 19 
00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529531 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb19dc97-c11c-4239-9119-66b62533468d" containerName="oc" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529553 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="gather" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.530721 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.545080 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.630287 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.630630 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.630792 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc 
kubenswrapper[4745]: I0319 00:42:35.732242 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732323 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732984 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732984 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.756815 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.852175 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:36 crc kubenswrapper[4745]: I0319 00:42:36.168153 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:36 crc kubenswrapper[4745]: I0319 00:42:36.311456 4745 scope.go:117] "RemoveContainer" containerID="4de285e74e641eb21f2a1ee98f32c5f610d3c8d1a0fc10bb8a444c82e684e43e" Mar 19 00:42:37 crc kubenswrapper[4745]: I0319 00:42:37.147962 4745 generic.go:334] "Generic (PLEG): container finished" podID="512efe3d-191c-49b8-bd41-897707ccc697" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d" exitCode=0 Mar 19 00:42:37 crc kubenswrapper[4745]: I0319 00:42:37.148656 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"} Mar 19 00:42:37 crc kubenswrapper[4745]: I0319 00:42:37.148713 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerStarted","Data":"58c40ca0c8930bf8462ccde33f2505afcaa5f046f05f364b86a83cb6df27a1e3"} Mar 19 00:42:39 crc kubenswrapper[4745]: I0319 00:42:39.177864 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" 
event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerStarted","Data":"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"} Mar 19 00:42:40 crc kubenswrapper[4745]: I0319 00:42:40.187498 4745 generic.go:334] "Generic (PLEG): container finished" podID="512efe3d-191c-49b8-bd41-897707ccc697" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719" exitCode=0 Mar 19 00:42:40 crc kubenswrapper[4745]: I0319 00:42:40.187569 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"} Mar 19 00:42:41 crc kubenswrapper[4745]: I0319 00:42:41.200216 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerStarted","Data":"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"} Mar 19 00:42:41 crc kubenswrapper[4745]: I0319 00:42:41.225458 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pn5dd" podStartSLOduration=2.762867799 podStartE2EDuration="6.225430783s" podCreationTimestamp="2026-03-19 00:42:35 +0000 UTC" firstStartedPulling="2026-03-19 00:42:37.154694911 +0000 UTC m=+2121.692890042" lastFinishedPulling="2026-03-19 00:42:40.617257895 +0000 UTC m=+2125.155453026" observedRunningTime="2026-03-19 00:42:41.218706328 +0000 UTC m=+2125.756901479" watchObservedRunningTime="2026-03-19 00:42:41.225430783 +0000 UTC m=+2125.763625914" Mar 19 00:42:45 crc kubenswrapper[4745]: I0319 00:42:45.853179 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:45 crc kubenswrapper[4745]: I0319 00:42:45.853965 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:46 crc kubenswrapper[4745]: I0319 00:42:46.904837 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pn5dd" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" probeResult="failure" output=< Mar 19 00:42:46 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Mar 19 00:42:46 crc kubenswrapper[4745]: > Mar 19 00:42:55 crc kubenswrapper[4745]: I0319 00:42:55.913867 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:55 crc kubenswrapper[4745]: I0319 00:42:55.963147 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:56 crc kubenswrapper[4745]: I0319 00:42:56.180903 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.331559 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pn5dd" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" containerID="cri-o://185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" gracePeriod=2 Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.727450 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.804530 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"512efe3d-191c-49b8-bd41-897707ccc697\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.804625 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"512efe3d-191c-49b8-bd41-897707ccc697\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.805910 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities" (OuterVolumeSpecName: "utilities") pod "512efe3d-191c-49b8-bd41-897707ccc697" (UID: "512efe3d-191c-49b8-bd41-897707ccc697"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.806197 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"512efe3d-191c-49b8-bd41-897707ccc697\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.807822 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.811991 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz" (OuterVolumeSpecName: "kube-api-access-xhxqz") pod "512efe3d-191c-49b8-bd41-897707ccc697" (UID: "512efe3d-191c-49b8-bd41-897707ccc697"). InnerVolumeSpecName "kube-api-access-xhxqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.908951 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.949522 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "512efe3d-191c-49b8-bd41-897707ccc697" (UID: "512efe3d-191c-49b8-bd41-897707ccc697"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.010434 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.343712 4745 generic.go:334] "Generic (PLEG): container finished" podID="512efe3d-191c-49b8-bd41-897707ccc697" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" exitCode=0 Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.343804 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.343830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"} Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.344691 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"58c40ca0c8930bf8462ccde33f2505afcaa5f046f05f364b86a83cb6df27a1e3"} Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.344727 4745 scope.go:117] "RemoveContainer" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.617776 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.618331 4745 scope.go:117] "RemoveContainer" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 
00:42:58.624612 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.643318 4745 scope.go:117] "RemoveContainer" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.667553 4745 scope.go:117] "RemoveContainer" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" Mar 19 00:42:58 crc kubenswrapper[4745]: E0319 00:42:58.668117 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b\": container with ID starting with 185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b not found: ID does not exist" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668155 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"} err="failed to get container status \"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b\": rpc error: code = NotFound desc = could not find container \"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b\": container with ID starting with 185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b not found: ID does not exist" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668184 4745 scope.go:117] "RemoveContainer" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719" Mar 19 00:42:58 crc kubenswrapper[4745]: E0319 00:42:58.668448 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719\": container with ID 
starting with 5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719 not found: ID does not exist" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668667 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"} err="failed to get container status \"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719\": rpc error: code = NotFound desc = could not find container \"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719\": container with ID starting with 5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719 not found: ID does not exist" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668691 4745 scope.go:117] "RemoveContainer" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d" Mar 19 00:42:58 crc kubenswrapper[4745]: E0319 00:42:58.668921 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d\": container with ID starting with a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d not found: ID does not exist" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d" Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668947 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"} err="failed to get container status \"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d\": rpc error: code = NotFound desc = could not find container \"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d\": container with ID starting with a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d not found: 
ID does not exist" Mar 19 00:43:00 crc kubenswrapper[4745]: I0319 00:43:00.149601 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512efe3d-191c-49b8-bd41-897707ccc697" path="/var/lib/kubelet/pods/512efe3d-191c-49b8-bd41-897707ccc697/volumes" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.152597 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564684-ltrd8"] Mar 19 00:44:00 crc kubenswrapper[4745]: E0319 00:44:00.153738 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-utilities" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.153761 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-utilities" Mar 19 00:44:00 crc kubenswrapper[4745]: E0319 00:44:00.153777 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-content" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.153784 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-content" Mar 19 00:44:00 crc kubenswrapper[4745]: E0319 00:44:00.153807 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.153813 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.154047 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.155616 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.159075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.159094 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.159079 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.161298 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564684-ltrd8"] Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.257376 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"auto-csr-approver-29564684-ltrd8\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.359198 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"auto-csr-approver-29564684-ltrd8\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.381013 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"auto-csr-approver-29564684-ltrd8\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " 
pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.480648 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.945241 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564684-ltrd8"] Mar 19 00:44:01 crc kubenswrapper[4745]: I0319 00:44:01.298047 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" event={"ID":"be744c99-b821-4dc8-8e93-92afcd2ca04a","Type":"ContainerStarted","Data":"ee71b4cb1ead71c102b39038fdf07cac65c3dd0a4c4d36b10b9ba5fc46461cd9"} Mar 19 00:44:02 crc kubenswrapper[4745]: I0319 00:44:02.308960 4745 generic.go:334] "Generic (PLEG): container finished" podID="be744c99-b821-4dc8-8e93-92afcd2ca04a" containerID="95dd962052ffe49051aaddb3d90230a87d8c24e2d8b45a7954e0d61fef0111b0" exitCode=0 Mar 19 00:44:02 crc kubenswrapper[4745]: I0319 00:44:02.309105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" event={"ID":"be744c99-b821-4dc8-8e93-92afcd2ca04a","Type":"ContainerDied","Data":"95dd962052ffe49051aaddb3d90230a87d8c24e2d8b45a7954e0d61fef0111b0"} Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.599795 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.718752 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"be744c99-b821-4dc8-8e93-92afcd2ca04a\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.724297 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj" (OuterVolumeSpecName: "kube-api-access-6qfcj") pod "be744c99-b821-4dc8-8e93-92afcd2ca04a" (UID: "be744c99-b821-4dc8-8e93-92afcd2ca04a"). InnerVolumeSpecName "kube-api-access-6qfcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.821225 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") on node \"crc\" DevicePath \"\"" Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.325763 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" event={"ID":"be744c99-b821-4dc8-8e93-92afcd2ca04a","Type":"ContainerDied","Data":"ee71b4cb1ead71c102b39038fdf07cac65c3dd0a4c4d36b10b9ba5fc46461cd9"} Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.325817 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee71b4cb1ead71c102b39038fdf07cac65c3dd0a4c4d36b10b9ba5fc46461cd9" Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.325819 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.679309 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"] Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.686201 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"] Mar 19 00:44:06 crc kubenswrapper[4745]: I0319 00:44:06.148521 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" path="/var/lib/kubelet/pods/c3ff29f0-5d28-4572-bd21-aac2f86091a8/volumes" Mar 19 00:44:15 crc kubenswrapper[4745]: I0319 00:44:15.606480 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:44:15 crc kubenswrapper[4745]: I0319 00:44:15.607270 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:44:36 crc kubenswrapper[4745]: I0319 00:44:36.446996 4745 scope.go:117] "RemoveContainer" containerID="97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413" Mar 19 00:44:45 crc kubenswrapper[4745]: I0319 00:44:45.606459 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:44:45 crc kubenswrapper[4745]: 
I0319 00:44:45.607204 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"